Nanodegree key: nd889
Version: 1.0.0
Locale: en-us
Become an expert in the core concepts of artificial intelligence and learn how to apply them to real-life problems.
Content
Part 01 : Introduction to Artificial Intelligence
In this term, you'll learn the foundations of AI with Sebastian Thrun, Peter Norvig, and Thad Starner. We'll cover Game Playing, Search, Optimization, Probabilistic AI, and Hidden Markov Models.
-
Module 01: Introduction to the Nanodegree
-
Lesson 01: Welcome to Artificial Intelligence
Welcome to Term 1 of the Artificial Intelligence Nanodegree program!
- Concept 01: Welcome to the Artificial Intelligence Nanodegree Program
- Concept 02: Meet Your Instructors
- Concept 03: Projects You Will Build
- Concept 04: Deadline Policy
- Concept 05: Udacity Support
- Concept 06: Community Guidelines
- Concept 07: Weekly Lesson Plans
- Concept 08: References & Resources
- Concept 09: Get Started
- Concept 10: Lesson Plan: Week 1
-
Lesson 02: Introduction to AI
An introduction to basic AI concepts with real-world examples.
- Concept 01: Welcome to AI!
- Concept 02: Navigation
- Concept 03: Game Playing
- Concept 04: Quiz: Tic Tac Toe
- Concept 05: Tic Tac Toe: Heuristics
- Concept 06: Quiz: Monty Hall Problem
- Concept 07: Monty Hall Problem: Explained
- Concept 08: Quiz: What is Intelligence?
- Concept 09: Defining Intelligence
- Concept 10: Agent, Environment And State
- Concept 11: Perception, Action and Cognition
- Concept 12: Quiz: Types of AI Problems
- Concept 13: Rational Behavior And Bounded Optimality
-
Lesson 03: Applying AI to Sudoku
In this lesson, you'll get a taste of the power of Artificial Intelligence by developing an algorithm to solve every Sudoku puzzle. Enjoy the fun of building your first AI agent and get coding! (A minimal code sketch of one strategy follows the concept list.)
- Concept 01: Intro
- Concept 02: Solving a Sudoku
- Concept 03: Setting up the Board
- Concept 04: Encoding the Board
- Concept 05: Strategy 1: Elimination
- Concept 06: Strategy 2: Only Choice
- Concept 07: Constraint Propagation
- Concept 08: Harder Sudoku
- Concept 09: Strategy 3: Search
- Concept 10: Coding the Solution
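To give a flavor of Strategy 1, here is a minimal elimination sketch. It assumes a hypothetical encoding (not spelled out in this outline) in which the board is a dict mapping box names like 'A1' to strings of candidate digits, and `peers` maps each box to the boxes sharing its row, column, or 3x3 square:

```python
def eliminate(values, peers):
    """Strategy 1: remove each solved box's digit from all of its peers."""
    solved = [box for box in values if len(values[box]) == 1]
    for box in solved:
        digit = values[box]
        for peer in peers[box]:
            values[peer] = values[peer].replace(digit, '')
    return values
```

Applying `eliminate` (and the other strategies) repeatedly until the board stops changing is the constraint propagation loop from Concept 07.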
-
Lesson 04: Setting up with Anaconda
Get your environment set up using Anaconda, an extremely popular way to manage your environments and packages in Python.
-
-
Module 02: Search and Optimization
-
Lesson 01: Introduction to Game Playing
In this lesson, you'll learn how to build a Game-Playing AI Agent. You'll focus on an agent that wins at the board game Isolation! You'll learn some of the seminal techniques in AI Game Playing, including Adversarial Search and Minimax Trees. (A minimal minimax sketch follows the concept list.)
- Concept 01: Lesson Plan: Week 2
- Concept 02: Course Introduction
- Concept 03: Overview
- Concept 04: The Minimax Algorithm
- Concept 05: Isolation
- Concept 06: Building a Game Tree
- Concept 07: Coding: Building a Game Class
- Concept 08: Which of These Are Valid Moves?
- Concept 09: Coding: Game Class Functionality
- Concept 10: Building a Game Tree (Contd.)
- Concept 11: Isolation Game Tree with Leaf Values
- Concept 12: How Do We Tell the Computer Not to Lose?
- Concept 13: MIN and MAX Levels
- Concept 14: Coding: Scoring Min & Max Levels
- Concept 15: Propagating Values Up the Tree
- Concept 16: Computing MIN MAX Values
- Concept 17: Computing MIN MAX Solution
- Concept 18: Choosing the Best Branch
- Concept 19: Coding: Minimax Search
- Concept 20: (DEPRECATED) Mini-Project: Coding Minimax
- Concept 21: Searching Simple Games Reading
- Concept 22: Lesson Plan: Week 3
- Concept 23: Max Number of Nodes Visited
- Concept 24: Max Moves
- Concept 25: The Branching Factor
- Concept 26: Number of Nodes in a Game Tree
- Concept 27: The Branching Factor (Contd.)
- Concept 28: Max Number of Nodes
- Concept 29: Depth-Limited Search
- Concept 30: Coding: Depth-Limited Minimax
- Concept 31: Evaluation Function Intro
- Concept 32: Testing the Evaluation Function
- Concept 33: Testing the Evaluation Function Part 2
- Concept 34: Testing Evaluation Functions
- Concept 35: Testing the Evaluation Function Part 3
- Concept 36: Quiescent Search
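As a rough illustration of the algorithm this lesson builds up, here is a minimal minimax sketch. The `state` interface (`is_terminal()`, `utility()`, `successors()`) is a hypothetical stand-in, not the course's project code:

```python
def minimax(state, maximizing):
    """Return the minimax value of `state` for the player to move."""
    if state.is_terminal():
        return state.utility()  # e.g. +1 for a MAX win, -1 for a MAX loss
    values = [minimax(s, not maximizing) for s in state.successors()]
    return max(values) if maximizing else min(values)
```

MAX picks the child with the highest value and MIN the lowest; propagating those values up the game tree is exactly what Concepts 15-18 walk through.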
-
Lesson 02: Advanced Game Playing
In this lesson, you'll build a Game-Playing agent that defeats opponents in Isolation. Along the way, you'll learn about advanced Game-Playing techniques such as Iterative Deepening, Alpha-Beta Pruning, and Expectimax. (A depth-limited alpha-beta sketch follows the concept list.)
- Concept 01: A Problem
- Concept 02: Iterative Deepening
- Concept 03: Understanding Exponential Time
- Concept 04: Exponential b=3
- Concept 05: Varying the Branching Factor
- Concept 06: Horizon Effect
- Concept 07: Horizon Effect (Contd.)
- Concept 08: Good Evaluation Functions
- Concept 09: Evaluating Evaluation Functions
- Concept 10: Alpha-Beta Pruning
- Concept 11: Minimax Quiz
- Concept 12: Alpha-Beta Pruning Quiz 1
- Concept 13: Alpha-Beta Pruning Quiz 2
- Concept 14: Thad’s Asides
- Concept 15: Searching Complex Games Reading
- Concept 16: Lesson Plan: Week 4
- Concept 17: Solving 5x5 Isolation
- Concept 18: 3-Player Games
- Concept 19: 3-Player Games Quiz
- Concept 20: 3-Player Alpha-Beta Pruning
- Concept 21: Multi-player Alpha-Beta Pruning Reading
- Concept 22: Probabilistic Games
- Concept 23: Sloppy Isolation
- Concept 24: Sloppy Isolation Expectimax
- Concept 25: Expectimax Alpha-Beta Pruning
- Concept 26: Probabilistic Alpha-Beta Pruning
- Concept 27: Improving Minimax
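A minimal depth-limited alpha-beta sketch combining the pruning and evaluation-function ideas from this lesson; as above, the `state` interface (`is_terminal()`, `score()`, `successors()`) is assumed for illustration:

```python
def alphabeta(state, depth, alpha, beta, maximizing):
    """Depth-limited minimax with alpha-beta pruning."""
    if depth == 0 or state.is_terminal():
        return state.score()  # heuristic evaluation at the search horizon
    if maximizing:
        value = float('-inf')
        for s in state.successors():
            value = max(value, alphabeta(s, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff: MIN will never let play reach here
        return value
    value = float('inf')
    for s in state.successors():
        value = min(value, alphabeta(s, depth - 1, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:
            break  # alpha cutoff
    return value
```

With good move ordering, pruning examines roughly b^(d/2) nodes instead of b^d, which is why it pairs so naturally with iterative deepening.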
-
Lesson 04: Search
In this lesson, you'll learn how to implement some of the seminal search algorithms that are a cornerstone of AI, including Breadth-First Search, Depth-First Search, and finally A* Search. You'll then put your skills to the test by teaching Pac-Man to navigate his world and complete complex tasks such as finding the fastest path through the map while maximizing points scored. (A minimal A* sketch follows the concept list.)
- Concept 01: Lesson Plan: Week 5
- Concept 02: Introducing Peter Norvig
- Concept 03: Introduction
- Concept 04: What Is A Problem?
- Concept 05: Example: Route Finding
- Concept 06: Quiz: Tree Search
- Concept 07: Tree Search Continued
- Concept 08: Quiz: Graph Search
- Concept 09: Quiz: Breadth First Search 1
- Concept 10: Breadth First Search 2
- Concept 11: Quiz: Breadth First Search 3
- Concept 12: Breadth First Search 4
- Concept 13: Breadth First Search 5
- Concept 14: Quiz: Uniform Cost Search
- Concept 15: Quiz: Uniform Cost Search 1
- Concept 16: Quiz: Uniform Cost Search 2
- Concept 17: Quiz: Uniform Cost Search 3
- Concept 18: Quiz: Uniform Cost Search 4
- Concept 19: Uniform Cost Search 5
- Concept 20: Quiz: Search Comparison
- Concept 21: Search Comparison 1
- Concept 22: Quiz: Search Comparison 2
- Concept 23: Search Comparison 3
- Concept 24: On Uniform Cost
- Concept 25: Quiz: A* Search
- Concept 26: Quiz: A* Search 1
- Concept 27: Quiz: A* Search 2
- Concept 28: Quiz: A* Search 3
- Concept 29: Quiz: A* Search 4
- Concept 30: A* Search 5
- Concept 31: Optimistic Heuristic
- Concept 32: Quiz: State Spaces
- Concept 33: State Spaces 1
- Concept 34: Quiz: State Spaces 2
- Concept 35: State Spaces 3
- Concept 36: Quiz: Sliding Blocks Puzzle
- Concept 37: Sliding Blocks Puzzle 1
- Concept 38: Sliding Blocks Puzzle 2
- Concept 39: Problems with Search
- Concept 40: A Note on Implementation
- Concept 41: Peter's take on AI
- Concept 42: Introduction to the Lab
- Concept 43: Exercise: Teaching Pac-Man to Search (Optional)
- Concept 44: (Coding) Workspace: Pacman Search
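To tie the lesson together, here is a minimal A* sketch over an abstract problem; the `goal_test`, `successors`, and `heuristic` callables are placeholders for whatever problem (route finding, Pac-Man) you plug in:

```python
import heapq
from itertools import count

def a_star(start, goal_test, successors, heuristic):
    """Expand the frontier node with the lowest f = g + h.
    `successors(s)` yields (next_state, step_cost) pairs; `heuristic`
    must be optimistic (never overestimate) for the result to be optimal."""
    tie = count()  # tiebreaker so unorderable states are never compared
    frontier = [(heuristic(start), next(tie), 0, start, [start])]
    explored = set()
    while frontier:
        _f, _, g, state, path = heapq.heappop(frontier)
        if goal_test(state):
            return path
        if state in explored:
            continue
        explored.add(state)
        for nxt, cost in successors(state):
            if nxt not in explored:
                g2 = g + cost
                heapq.heappush(
                    frontier,
                    (g2 + heuristic(nxt), next(tie), g2, nxt, path + [nxt]))
    return None  # no path exists
```

With `heuristic(s) == 0` everywhere, this reduces to Uniform Cost Search, the comparison Concepts 14-24 build toward.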
-
Lesson 05: Simulated Annealing
In this lesson, you'll learn how to explore spaces and avoid local optima by using Simulated Annealing. In the process, you'll solve the famous n-Queens problem using this advanced AI technique! (A minimal annealing loop follows the concept list.)
- Concept 01: Lesson Plan: Week 6
- Concept 02: Introduction to Simulated Annealing
- Concept 03: Iterative Improvement Problems: TSP
- Concept 04: 4-Queens
- Concept 05: 5-Queens Quiz
- Concept 06: n-Queens Heuristic Function
- Concept 07: n-Queens Local Minima
- Concept 08: Hill Climbing
- Concept 09: Local Maximum
- Concept 10: Random Restart
- Concept 11: Hill Climbing Quiz
- Concept 12: Step Size Too Small
- Concept 13: Step Size Too Large
- Concept 14: Hill Climbing Quiz 2
- Concept 15: Annealing
- Concept 16: Simulated Annealing
- Concept 17: Simulated Simulated Annealing
- Concept 18: Local Beam Search
- Concept 19: Representing n-Queens
- Concept 20: 8-Queens Representation
- Concept 21: Genetic Algorithms
- Concept 22: GA Crossover
- Concept 23: GA Mutation
- Concept 24: GA Crossover Quiz
- Concept 25: Similarities Between Optimizers
- Concept 26: Challenge Question Revisited
- Concept 27: Readings on Simulated Annealing
- Concept 28: Lab: Simulated Annealing (Optional)
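A minimal annealing loop in the spirit of the AIMA pseudocode; the `problem` interface (`initial_state()`, `random_neighbor()`, `value()`) and the cooling `schedule` are assumptions for illustration:

```python
import math
import random

def simulated_annealing(problem, schedule):
    """Hill climbing that occasionally accepts downhill moves."""
    current = problem.initial_state()
    t = 0
    while True:
        temperature = schedule(t)
        if temperature <= 1e-12:
            return current  # fully cooled: pure hill climbing from here
        neighbor = problem.random_neighbor(current)
        delta = problem.value(neighbor) - problem.value(current)
        # Always accept improvements; accept bad moves with prob e^(delta/T).
        if delta > 0 or random.random() < math.exp(delta / temperature):
            current = neighbor
        t += 1
```

A simple exponential schedule such as `lambda t: 100 * 0.99 ** t` works for toy problems; high early temperatures let the search escape the local optima that defeat plain hill climbing.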
-
Lesson 06: Constraint Satisfaction
In this lesson, we'll return to one of the main techniques we used to solve Sudoku: constraint propagation. We'll see how to use known constraints to solve a wide variety of problems, including Map Coloring problems and simple puzzles. (A bare-bones backtracking sketch follows the concept list.)
- Concept 01: Lesson Plan: Week 7
- Concept 02: Introduction
- Concept 03: Map Coloring
- Concept 04: Constraint Graph
- Concept 05: Map Coloring Quiz
- Concept 06: Constraint Types
- Concept 07: Backtracking Search
- Concept 08: Improving Backtracking Efficiency
- Concept 09: Backtracking Optimization Quiz
- Concept 10: Forward Checking
- Concept 11: Constraint Propagation and Arc Consistency
- Concept 12: Constraint Propagation Quiz
- Concept 13: Structured CSPs
- Concept 14: Iterative Algorithms
- Concept 15: Readings on Constraint Satisfaction
- Concept 16: Lab: Constraint Satisfaction Problems (Optional)
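A bare-bones backtracking search sketch; the `consistent` callable is a placeholder constraint check (for map coloring it would return False whenever an already-assigned neighbor has the same color):

```python
def backtrack(assignment, variables, domains, consistent):
    """Plain backtracking search for a CSP."""
    if len(assignment) == len(variables):
        return assignment  # every variable assigned: solved
    var = next(v for v in variables if v not in assignment)
    for val in domains[var]:
        if consistent(var, val, assignment):
            assignment[var] = val
            result = backtrack(assignment, variables, domains, consistent)
            if result is not None:
                return result
            del assignment[var]  # undo and try the next value
    return None  # no value works: backtrack
```

Concepts 08-11 improve on this skeleton with variable and value ordering, forward checking, and arc consistency.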
-
-
Module 03: Logic, Reasoning, and Planning
-
Lesson 01: Logic and Reasoning
In this lesson you'll learn to build intelligent systems that can reason using logic! This is in many ways one of the foundational pieces of intelligence - the ability to arrive at new conclusions from a given set of facts.
- Concept 01: Introduction
- Concept 02: Background and Expert Systems
- Concept 03: Propositional Logic
- Concept 04: Truth Tables
- Concept 05: Truth Table Question
- Concept 06: Propositional Logic Question
- Concept 07: Terminology
- Concept 08: Propositional Logic Limitations
- Concept 09: First Order Logic
- Concept 10: Models
- Concept 11: Syntax
- Concept 12: Vacuum World
- Concept 13: FOL Question
- Concept 14: FOL Question 2
-
Lesson 02: Planning
Explore how we can use logic and search to plan out complex itineraries. Many of these planning approaches are the same ones used to power Self-Driving Cars!
- Concept 01: Lesson Plan: Week 8
- Concept 02: Problem Solving vs Planning
- Concept 03: Planning vs Execution
- Concept 04: Vacuum Cleaner Example
- Concept 05: Quiz: Sensorless Vacuum Cleaner Problem
- Concept 06: Partially Observable Vacuum Cleaner Example
- Concept 07: Quiz: Stochastic Environment Problem
- Concept 08: Infinite Sequences
- Concept 09: Finding a Successful Plan
- Concept 10: Quiz: Finding a Successful Plan Question
- Concept 11: Problem Solving via Mathematical Notation
- Concept 12: Tracking the Predict-Update Cycle
- Concept 13: Classical Planning 1
- Concept 14: Classical Planning 2
- Concept 15: Progression Search
- Concept 16: Regression Search
- Concept 17: Regression vs Progression
- Concept 18: Plan Space Search
- Concept 19: Sliding Puzzle Example
- Concept 20: Situation Calculus 1
- Concept 21: Situation Calculus 2
- Concept 22: Situation Calculus 3
-
-
Module 04: Probabilistic Models
-
Lesson 01: Probability
Learn to model uncertainty in the real world using probability theory. (A worked Bayes Rule computation follows the concept list.)
- Concept 01: Lesson Plan: Week 9
- Concept 02: Intro to Probability and Bayes Nets
- Concept 03: Quiz: Probability / Coin Flip
- Concept 04: Quiz: Coin Flip 2
- Concept 05: Quiz: Coin Flip 3
- Concept 06: Quiz: Coin Flip 4
- Concept 07: Quiz: Coin Flip 5
- Concept 08: Probability Summary
- Concept 09: Quiz: Dependence
- Concept 10: What We Learned
- Concept 11: Quiz: Weather
- Concept 12: Quiz: Weather 2
- Concept 13: Quiz: Weather 3
- Concept 14: Quiz: Cancer
- Concept 15: Quiz: Cancer 2
- Concept 16: Quiz: Cancer 3
- Concept 17: Quiz: Cancer 4
- Concept 18: Bayes Rule
- Concept 19: Readings on Probability
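As a worked example of Bayes Rule in the style of the cancer quizzes, with illustrative numbers (not the quizzes' actual values):

```python
# P(C): prior; P(+|C): test sensitivity; P(+|~C): false-positive rate.
p_c, p_pos_given_c, p_pos_given_not_c = 0.01, 0.9, 0.1
p_pos = p_pos_given_c * p_c + p_pos_given_not_c * (1 - p_c)  # total probability
p_c_given_pos = p_pos_given_c * p_c / p_pos                  # Bayes Rule
print(round(p_c_given_pos, 4))  # 0.0833
```

Even a fairly accurate test yields a small posterior when the prior is tiny; a positive result here leaves only about an 8% chance of cancer.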
-
Lesson 02: Bayes Nets
Learn to encode probability distributions using compact graphical models that enable efficient analysis.
- Concept 01: Introduction
- Concept 02: Quiz: Bayes Network
- Concept 03: Computing Bayes Rule
- Concept 04: Quiz: Two Test Cancer
- Concept 05: Quiz: Two Test Cancer 2
- Concept 06: Quiz: Conditional Independence
- Concept 07: Quiz: Conditional Independence 2
- Concept 08: Quiz: Absolute And Conditional
- Concept 09: Quiz: Confounding Cause
- Concept 10: Quiz: Explaining Away
- Concept 11: Quiz: Explaining Away 2
- Concept 12: Quiz: Explaining Away 3
- Concept 13: Conditional Dependence
- Concept 14: Lesson Plan: Week 10
- Concept 15: Quiz: General Bayes Net
- Concept 16: Quiz: General Bayes Net 2
- Concept 17: Quiz: General Bayes Net 3
- Concept 18: Value Of A Network
- Concept 19: Quiz: D Separation
- Concept 20: Quiz: D Separation 2
-
Lesson 03: Inference in Bayes Nets
In this lesson, you will learn about probabilistic inference using Bayes Nets, i.e. how to answer questions that you are interested in, given certain inputs.
- Concept 01: Probabilistic Inference
- Concept 02: Quiz: Overview and Example
- Concept 03: Quiz: Enumeration
- Concept 04: Quiz: Speeding Up Enumeration
- Concept 05: Quiz: Speeding Up Enumeration 2
- Concept 06: Quiz: Speeding Up Enumeration 3
- Concept 07: Quiz: Speeding Up Enumeration 4
- Concept 08: Causal Direction
- Concept 09: Quiz: Variable Elimination
- Concept 10: Quiz: Variable Elimination 2
- Concept 11: Quiz: Variable Elimination 3
- Concept 12: Variable Elimination 4
- Concept 13: Lesson Plan: Week 11
- Concept 14: Approximate Inference
- Concept 15: Quiz: Sampling Example
- Concept 16: Approximate Inference 2
- Concept 17: Rejection Sampling
- Concept 18: Quiz: Likelihood Weighting
- Concept 19: Likelihood Weighting 1
- Concept 20: Likelihood Weighting 2
- Concept 21: Gibbs Sampling
- Concept 22: Quiz: Monty Hall Problem
- Concept 23: Monty Hall Letter
-
Lesson 04: Hidden Markov Models
Learn to process sequences and time-series data using Hidden Markov Models. (A minimal Viterbi sketch follows the concept list.)
- Concept 01: Pattern Recognition through Time
- Concept 02: Dolphin Whistles
- Concept 03: Problems Matching Dolphin Whistles
- Concept 04: Warping Time
- Concept 05: Euclidean Distance Not Sufficient
- Concept 06: Dynamic Time Warping
- Concept 07: Sakoe Chiba Bounds
- Concept 08: Readings on DTW
- Concept 09: Hidden Markov Models
- Concept 10: Lesson Plan: Week 12
- Concept 11: HMM Representation
- Concept 12: Sign Language Recognition
- Concept 13: Delta-y Quiz
- Concept 14: HMM: "I"
- Concept 15: HMM: "We"
- Concept 16: I vs We Quiz
- Concept 17: Viterbi Trellis: "I"
- Concept 18: "I" Transitions Quiz
- Concept 19: Viterbi Trellis: "I" (continued)
- Concept 20: Nodes for "I"
- Concept 21: Viterbi Path
- Concept 22: "We": Transitions Quiz
- Concept 23: "We": Transition Probabilities Quiz
- Concept 24: "We": Output Probabilities Quiz
- Concept 25: "We": Viterbi Path
- Concept 26: Which Gesture is Recognized?
- Concept 27: New Observation Sequence for "I"
- Concept 28: New Observation Sequence for "We"
- Concept 29: HMM Training
- Concept 30: Baum Welch
- Concept 31: Readings on HMMs
- Concept 32: Multidimensional Output Probabilities
- Concept 33: Using a Mixture of Gaussians
- Concept 34: HMM Topologies
- Concept 35: Phrase Level Recognition
- Concept 36: Stochastic Beam Search
- Concept 37: Context Training
- Concept 38: Statistical Grammar
- Concept 39: State Tying
- Concept 40: HMM Resources
- Concept 41: Segmentally Boosted HMMs
- Concept 42: SBHMM Resources
- Concept 43: Using HMMs to Generate Data
- Concept 44: HMMs for Speech Synthesis
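A toy discrete-observation Viterbi decoder, sketching the trellis computation from Concepts 17-25; the matrix conventions (`pi`, `A`, `B`) are my notation, not the lesson's:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for an observation sequence.
    pi[i]: initial prob of state i; A[i, j]: transition i -> j;
    B[i, o]: prob that state i emits discrete observation o."""
    n_states, T = A.shape[0], len(obs)
    prob = np.zeros((n_states, T))
    back = np.zeros((n_states, T), dtype=int)
    prob[:, 0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(n_states):
            scores = prob[:, t - 1] * A[:, j]
            back[j, t] = np.argmax(scores)
            prob[j, t] = scores[back[j, t]] * B[j, obs[t]]
    path = [int(np.argmax(prob[:, -1]))]     # best final state...
    for t in range(T - 1, 0, -1):
        path.append(int(back[path[-1], t]))  # ...then follow backpointers
    return path[::-1]
```

Gesture recognition then amounts to running one trained HMM per word ("I", "We", ...) and picking whichever model yields the highest Viterbi probability.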
-
Lesson 06: Wrapping Up Term 1
Links to lesson plans for weeks 13 & 14 of term 1.
-
Part 02 : Deep Learning and Applications
In this term, you'll learn about cutting-edge advancements in AI and Deep Learning. You'll get the chance to apply Deep Learning to a variety of topics including Computer Vision, Speech, and Natural Language Processing. We'll cover Convolutional Neural Networks, Recurrent Neural Networks, and other advanced models.
-
Module 01: Introduction to the Nanodegree
-
Lesson 01: Welcome to Artificial Intelligence
Welcome to Term 2 of the Artificial Intelligence Nanodegree program!
-
Lesson 02: Review Anaconda Setup
Review the instructions for installing & configuring Anaconda on your system.
-
Lesson 03: Cloud Computing Setup Instructions
Instructions for configuring cloud services with GPU access.
-
Lesson 04: GPU Workspaces Demo
Introduces and demonstrates the functionality of GPU Workspaces in the Udacity classroom.
-
-
Module 02: Introduction to Deep Learning
-
Lesson 01: Deep Neural Networks
Luis will give you a solid foundation in Deep Learning and teach you how to apply Neural Networks to analyze real data! (A minimal Keras sketch follows the concept list.)
- Concept 01: Introduction
- Concept 02: Classification Problems 1
- Concept 03: Classification Problems 2
- Concept 04: Linear Boundaries
- Concept 05: Higher Dimensions
- Concept 06: Perceptrons
- Concept 07: Why "Neural Networks"?
- Concept 08: Perceptrons as Logical Operators
- Concept 09: Perceptron Trick
- Concept 10: Perceptron Algorithm
- Concept 11: Non-Linear Regions
- Concept 12: Error Functions
- Concept 13: Log-loss Error Function
- Concept 14: Discrete vs Continuous
- Concept 15: Softmax
- Concept 16: One-Hot Encoding
- Concept 17: Maximum Likelihood
- Concept 18: Maximizing Probabilities
- Concept 19: Cross-Entropy 1
- Concept 20: Cross-Entropy 2
- Concept 21: Multi-Class Cross Entropy
- Concept 22: Logistic Regression
- Concept 23: Gradient Descent
- Concept 24: Perceptron vs Gradient Descent
- Concept 25: Continuous Perceptrons
- Concept 26: Non-linear Data
- Concept 27: Non-Linear Models
- Concept 28: Neural Network Architecture
- Concept 29: Feedforward
- Concept 30: Backpropagation
- Concept 31: Keras
- Concept 32: Mini Project: Students Admissions in Keras
- Concept 33: Lesson Plan: Week 2
- Concept 34: Training Optimization
- Concept 35: Batch vs Stochastic Gradient Descent
- Concept 36: Learning Rate Decay
- Concept 37: Testing
- Concept 38: Overfitting and Underfitting
- Concept 39: Early Stopping
- Concept 40: Regularization
- Concept 41: Regularization 2
- Concept 42: Dropout
- Concept 43: Vanishing Gradient
- Concept 44: Other Activation Functions
- Concept 45: Local Minima
- Concept 46: Random Restart
- Concept 47: Momentum
- Concept 48: Optimizers in Keras
- Concept 49: Error Functions Around the World
- Concept 50: Mini Project Intro
- Concept 51: Mini Project: IMDB Data in Keras
- Concept 52: Outro
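The Keras mini-projects build models along these lines; the layer sizes, input width, and class count here are placeholders, not the projects' actual values:

```python
from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(128, activation='relu', input_dim=32))  # hidden layer
model.add(Dropout(0.2))                                 # regularization
model.add(Dense(2, activation='softmax'))               # class probabilities
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])
# model.fit(X_train, y_train, epochs=10, batch_size=100) runs training
```

`fit` then performs the (mini-batch) gradient descent loop that Concepts 34-35 discuss.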
-
-
Module 03: Convolutional Neural Networks
-
Lesson 01: Convolutional Neural Networks
Alexis explains the theory behind Convolutional Neural Networks and how they help us dramatically improve performance in image classification. (A small Keras CNN sketch follows the concept list.)
- Concept 01: Lesson Plan: Week 3
- Concept 02: Introducing Alexis
- Concept 03: Applications of CNNs
- Concept 04: How Computers Interpret Images
- Concept 05: MLPs for Image Classification
- Concept 06: Categorical Cross-Entropy
- Concept 07: Model Validation in Keras
- Concept 08: When do MLPs (not) work well?
- Concept 09: Mini Project: Training an MLP on MNIST
- Concept 10: Local Connectivity
- Concept 11: Convolutional Layers (Part 1)
- Concept 12: Convolutional Layers (Part 2)
- Concept 13: Stride and Padding
- Concept 14: Convolutional Layers in Keras
- Concept 15: Quiz: Dimensionality
- Concept 16: Pooling Layers
- Concept 17: Max Pooling Layers in Keras
- Concept 18: Lesson Plan: Week 4
- Concept 19: CNNs for Image Classification
- Concept 20: CNNs in Keras: Practical Example
- Concept 21: Mini Project: CNNs in Keras
- Concept 22: Image Augmentation in Keras
- Concept 23: Mini Project: Image Augmentation in Keras
- Concept 24: Groundbreaking CNN Architectures
- Concept 25: Visualizing CNNs (Part 1)
- Concept 26: Visualizing CNNs (Part 2)
- Concept 27: Transfer Learning
- Concept 28: Transfer Learning in Keras
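A small Keras CNN in the style this lesson builds up, with placeholder shapes (32x32 RGB input, 10 classes):

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential()
model.add(Conv2D(filters=16, kernel_size=2, padding='same',
                 activation='relu', input_shape=(32, 32, 3)))
model.add(MaxPooling2D(pool_size=2))  # halves the spatial dimensions
model.add(Conv2D(32, 2, padding='same', activation='relu'))
model.add(MaxPooling2D(pool_size=2))
model.add(Flatten())
model.add(Dense(10, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop',
              metrics=['accuracy'])
```

Each Conv2D/MaxPooling2D pair deepens the representation while shrinking its spatial extent, the pattern Concepts 11-19 motivate.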
-
Lesson 02: CNN Project: Dog Breed Classifier
In this project, you will learn how to build a pipeline to process real-world, user-supplied images. Given an image of a dog, your algorithm will identify an estimate of the canine’s breed.
-
-
Module 04: TensorFlow
-
Lesson 01: Intro to TensorFlow
In this section, you'll get a hands-on introduction to deep learning and TensorFlow, Google's deep learning framework, and you'll apply it to an image dataset. (A graph-and-session sketch follows the concept list.)
- Concept 01: Intro
- Concept 02: Installing TensorFlow
- Concept 03: Hello, Tensor World!
- Concept 04: Quiz: TensorFlow Linear Function
- Concept 05: Quiz: TensorFlow Softmax
- Concept 06: Quiz: TensorFlow Cross Entropy
- Concept 07: Quiz: Mini-batch
- Concept 08: Epochs
- Concept 09: Lab: TensorFlow Neural Network
- Concept 10: Two-layer Neural Network
- Concept 11: Quiz: TensorFlow ReLUs
- Concept 12: Deep Neural Network in TensorFlow
- Concept 13: Save and Restore TensorFlow Models
- Concept 14: Finetuning
- Concept 15: Quiz: TensorFlow Dropout
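This lesson predates TensorFlow 2.x, so its code follows the graph-and-session pattern. A minimal "Hello, Tensor World!"-style linear function, with made-up shapes:

```python
import tensorflow as tf  # the pre-2.0, graph-based API

x = tf.constant([[1.0, 2.0]])                 # 1x2 input
W = tf.Variable(tf.truncated_normal([2, 3]))  # weights
b = tf.Variable(tf.zeros([3]))                # biases
logits = tf.add(tf.matmul(x, W), b)           # linear function xW + b

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(logits))  # nothing computes until the session runs it
```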
-
-
Module 05: Autoencoders
-
Lesson 01: Autoencoders
Autoencoders are neural networks used for data compression, image denoising, and dimensionality reduction. In this lesson, Mat will teach how to build autoencoders using TensorFlow.
-
-
Module 06: Recurrent Neural Networks
-
Lesson 01: Recurrent Neural Networks
Jeremy explains Recurrent Neural Networks and their cutting-edge applications to text-based sequence generation.
- Concept 01: Introducing Jeremy
- Concept 02: Section 1: Motivation for RNNs
- Concept 03: Motivation for RNNs
- Concept 04: Vanilla supervised learners and structured input
- Concept 05: Section 2: Motivating and Modelling Recursive Sequences
- Concept 06: Motivating and modeling recursive sequences
- Concept 07: Simple recursive examples
- Concept 08: Recursive or not? Part 1
- Concept 09: Recursive or not? Part 2
- Concept 10: Recursive or not? Part 3
- Concept 11: Ways of thinking about recursivity
- Concept 12: Driving a recursive sequence
- Concept 13: Section summary
- Concept 14: Section 3: Injecting recursivity into a learner (the lazy way)
- Concept 15: Injecting Recursivity into a Learner (the lazy way)
- Concept 16: A first example
- Concept 17: Setting up the example
- Concept 18: Windowing the example sequence
- Concept 19: Using Keras for fitting
- Concept 20: Using a regressor as a generative model
- Concept 21: A second example
- Concept 22: Setting up the second example
- Concept 23: Wrapping up the second example
- Concept 24: Interesting twists on the second example
- Concept 25: Real time series example
- Concept 26: Section summary
- Concept 27: Section 4: Injecting Recursivity into Learners the Smart Way
- Concept 28: Coding up a crazy recursive sequence
- Concept 29: Flaws with the FNN approach
- Concept 30: RNN fundamental derivations
- Concept 31: Formulating a Least Squares loss
- Concept 32: RNNs and memory
- Concept 33: RNNs and graphical models
- Concept 34: RNN Technical Issues
- Concept 35: Section and course summary
- Concept 36: Outro
-
Lesson 02: Long Short-Term Memory Networks (LSTM)
Luis explains Long Short-Term Memory Networks (LSTM) and similar architectures that have the benefit of preserving long-term memory. (A minimal Keras LSTM sketch follows the concept list.)
- Concept 01: Intro to LSTM
- Concept 02: RNN vs LSTM
- Concept 03: Basics of LSTM
- Concept 04: Architecture of LSTM
- Concept 05: The Learn Gate
- Concept 06: The Forget Gate
- Concept 07: The Remember Gate
- Concept 08: The Use Gate
- Concept 09: Putting it All Together
- Concept 10: Quiz
- Concept 11: Other architectures
- Concept 12: Outro LSTM
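A minimal Keras LSTM sketch of the kind used in the upcoming time-series project; the window length (50), feature count (1), and cell size (32) are placeholders:

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(32, input_shape=(50, 1)))  # 32-unit recurrent memory cell
model.add(Dense(1))                       # one-step-ahead prediction
model.compile(loss='mean_squared_error', optimizer='adam')
```

Inside the LSTM layer, the learn, forget, remember, and use gates from this lesson decide what the cell state keeps between time steps.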
-
Lesson 03: Implementing RNNs and LSTMs
In this lesson, Mat will review the concepts of RNNs and LSTMs, and then you'll see how a character-wise recurrent network is implemented in TensorFlow.
- Concept 01: Intro
- Concept 02: Review of RNNs
- Concept 03: Review of LSTMs
- Concept 04: Character-wise RNNs
- Concept 05: Sequence Batching
- Concept 06: Character-wise RNN Notebook
- Concept 07: Implementing a Character-wise RNN
- Concept 08: Batching Data Solution
- Concept 09: LSTM Cell
- Concept 10: LSTM Cell Solution
- Concept 11: RNN Output
- Concept 12: Network Loss
- Concept 13: Output and Loss Solutions
- Concept 14: Build the Network
- Concept 15: Build the Network Solution
- Concept 16: RNN Resources
-
Lesson 04: Hyperparameters
In this section, Jay will teach you about some important hyperparameters used for our deep learning work, including those used for Recurrent Neural Networks.
- Concept 01: Introducing Jay
- Concept 02: Introduction
- Concept 03: Learning Rate
- Concept 04: Learning Rate
- Concept 05: Minibatch Size
- Concept 06: Number of Training Iterations / Epochs
- Concept 07: Number of Hidden Units / Layers
- Concept 08: RNN Hyperparameters
- Concept 09: RNN Hyperparameters
- Concept 10: Sources & References
-
Lesson 05: Sentiment Prediction with RNN
In this lesson, you'll implement a sentiment prediction RNN.
-
Lesson 06: RNN Project: Time Series Prediction and Text Generation
In this project you'll build RNNs that can generate sequences based on input data.
Project Description - Time Series Prediction and Text Generation
-
-
Module 07: Generative Adversarial Networks
-
Lesson 01: Generative Adversarial Networks
Ian Goodfellow, the inventor of GANs, introduces you to these exciting models. You'll also implement your own GAN on the MNIST dataset. (A loss-function sketch follows the concept list.)
- Concept 01: Introducing Ian Goodfellow
- Concept 02: What can you do with GANs?
- Concept 03: How GANs work
- Concept 04: Games and Equilibria
- Concept 05: Practical tips and tricks for training GANs
- Concept 06: Build a GAN
- Concept 07: Get started with a GAN
- Concept 08: Generator Network
- Concept 09: Discriminator Network
- Concept 10: Generator and Discriminator Solutions
- Concept 11: Building the Network
- Concept 12: Building the Network Solution
- Concept 13: Training Losses
- Concept 14: Training Optimizers
- Concept 15: Training Losses and Optimizers Solution
- Concept 16: A Trained GAN
- Concept 17: Doing More With Your GAN
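A sketch of standard GAN losses computed from raw discriminator logits, including the real-label smoothing trick mentioned in the practical-tips video; the function name and the `smooth` default are mine:

```python
import tensorflow as tf  # pre-2.0 API, matching the lesson's era

def gan_losses(d_logits_real, d_logits_fake, smooth=0.1):
    """Discriminator and generator losses for a vanilla GAN."""
    xent = tf.nn.sigmoid_cross_entropy_with_logits
    d_loss_real = tf.reduce_mean(xent(
        logits=d_logits_real,
        labels=tf.ones_like(d_logits_real) * (1 - smooth)))  # smoothed 1s
    d_loss_fake = tf.reduce_mean(xent(
        logits=d_logits_fake, labels=tf.zeros_like(d_logits_fake)))
    g_loss = tf.reduce_mean(xent(
        logits=d_logits_fake, labels=tf.ones_like(d_logits_fake)))
    return d_loss_real + d_loss_fake, g_loss
```

Note that the generator's loss flips the labels on the fake logits: it is rewarded exactly when the discriminator mistakes its samples for real ones.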
-
Lesson 02: Deep Convolutional GANs
In this lesson you'll implement a Deep Convolutional GAN to generate complex color images of house numbers.
- Concept 01: Deep Convolutional GANs
- Concept 02: DCGAN Architecture
- Concept 03: Batch Normalization
- Concept 04: DCGAN Implementation
- Concept 05: DCGAN and the Generator
- Concept 06: Generator Solution
- Concept 07: Discriminator
- Concept 08: Discriminator Solution
- Concept 09: Building and Training the Network
- Concept 10: Hyperparameter Solutions
-
Lesson 03: Semisupervised Learning
Ian Goodfellow leads you through a semi-supervised GAN model, a classifier that can learn from mostly unlabeled data.
- Concept 01: Semi-supervised Learning
- Concept 02: Semi-Supervised Classification with GANs
- Concept 03: Introducing Semi-Supervised Learning
- Concept 04: Data Prep
- Concept 05: Building The Generator And Discriminator
- Concept 06: Model Loss Exercise
- Concept 07: Model Optimization Exercise
- Concept 08: Training The Network
- Concept 09: Discriminator Solution
- Concept 10: Model Loss Solution
- Concept 11: Model Optimizer Solution
- Concept 12: Trained Semi-Supervised GAN
-
-
Module 08: Concentrations
-
Lesson 01: Concentration Previews
In this final section of the nanodegree, you’ll choose a concentration in either Voice User Interfaces, Natural Language Processing, or Computer Vision.
-
Lesson 02: Intro to Computer Vision
Learn what computer vision is all about and its applications in the fields of artificial and emotional intelligence.
- Concept 01: Welcome to Computer Vision
- Concept 02: What is Vision?
- Concept 03: Role in AI
- Concept 04: Computer Vision Applications
- Concept 05: Emotional Intelligence
- Concept 06: Vision-based Emotion AI
- Concept 07: Computer Vision Pipeline
- Concept 08: Quiz: Pipeline Steps
- Concept 09: Training a Model
- Concept 10: AffdexMe Demo
- Concept 11: Emotion as a Service
- Concept 12: [Preview] Project: Mimic Me!
-
Lesson 03: Intro to Natural Language Processing
Find out how Natural Language Processing is being used in the industry, why it is challenging, and learn to design an NLP solution using IBM Watson's cloud-based services.
- Concept 01: NLP Overview
- Concept 02: Structured Languages
- Concept 03: Grammar
- Concept 04: Unstructured Text
- Concept 05: Counting Words
- Concept 06: Context Is Everything
- Concept 07: NLP and IBM Watson
- Concept 08: Applications of NLP
- Concept 09: Challenges in NLP
- Concept 10: NLP Services
- Concept 11: Getting Started with Watson
- Concept 12: Deploying a Bluemix Application
- Concept 13: Towards Augmented Intelligence
- Concept 14: [Preview] Project: Bookworm
-
Lesson 04: Intro to Voice User Interfaces
Voice User Interfaces make interacting with machines more natural and less tedious. Learn how you can design and deploy your own VUI using Amazon's Alexa Skills Kit!
- Concept 01: Welcome to Voice User Interfaces!
- Concept 02: VUI Overview
- Concept 03: VUI Applications
- Concept 04: What is an Alexa Skill?
- Concept 05: Conversational AI with Alexa
- Concept 06: VUI Best Practices
- Concept 07: Lab: Space Geek
- Concept 08: Alexa Skills - Beyond Space Geek
- Concept 09: [Preview] Project: Alexa History Skill
-
Part 03 : Computer Vision
In this module, you will learn how to build intelligent systems that can see and understand the world using Computer Vision. You'll learn fundamental techniques for tasks like Object Recognition, Face Detection, and Video Analysis, and integrate classic methods with more modern Convolutional Neural Networks.
-
Module 01: Introduction to Computer Vision
-
Lesson 02: Intro to Computer Vision
Learn what computer vision is all about and its applications in the fields of artificial and emotional intelligence.
- Concept 01: Welcome to Computer Vision
- Concept 02: What is Vision?
- Concept 03: Role in AI
- Concept 04: Computer Vision Applications
- Concept 05: Emotional Intelligence
- Concept 06: Vision-based Emotion AI
- Concept 07: Computer Vision Pipeline
- Concept 08: Quiz: Pipeline Steps
- Concept 09: Training a Model
- Concept 10: AffdexMe Demo
- Concept 11: Emotion as a Service
- Concept 12: [Preview] Project: Mimic Me!
-
Lesson 03: Mimic Me!
Learn to track faces in a video and identify facial expressions using Affectiva's Emotion-as-a-Service API!
-
Module 02: Computer Vision Fundamentals
-
Lesson 01: Image Representation and Analysis
In this section, you'll learn the fundamentals of computer vision: from how an image is formed to how to process, filter, and transform images!
- Concept 01: Intro to Image Processing
- Concept 02: Pre-Processing
- Concept 03: Quiz: Color or Grayscale?
- Concept 04: When Color IS Important
- Concept 05: Image Formation
- Concept 06: Images as Functions
- Concept 07: Quiz: Image Operations
- Concept 08: Color Thresholds
- Concept 09: Installing OpenCV
- Concept 10: Coding a Blue Screen
- Concept 11: Quiz: Color Threshold
- Concept 12: Color Spaces and Transforms
- Concept 13: Geometric Transforms
- Concept 14: Transforming Text
- Concept 15: Quiz: Warp the Perspective
- Concept 16: Filters Revisited!
- Concept 17: Frequency in Images
- Concept 18: High-pass Filters
- Concept 19: Quiz: Kernels
- Concept 20: Creating a Filter
- Concept 21: Gradients and Sobel Filters
- Concept 22: Quiz: Code Your Own Filter
- Concept 23: Low-pass Filters
- Concept 24: Gaussian Blur
- Concept 25: Canny Edge Detector
- Concept 26: Review
-
Lesson 02: Image Segmentation
In this section, you'll learn how to break an image up into segments and areas of interest using a variety of different algorithms!
- Concept 01: Image Segmentation
- Concept 02: Image Contours
- Concept 03: Quiz: Contour Features
- Concept 04: Hough Transform
- Concept 05: Quiz: Hough Space
- Concept 06: Hough Line Detection
- Concept 07: Quiz: Detect Lane Lines
- Concept 08: K-means Clustering
- Concept 09: K-means Implementation
- Concept 10: CNNs in Image Segmentation
- Concept 11: Review the CV Pipeline
-
Lesson 03: Features and Object Recognition
Here, we discuss the end goal of many computer vision applications: feature extraction and object recognition. We'll cover why these are important applications and how to code them!
- Concept 01: Features and Object Recognition
- Concept 02: Why Use Features?
- Concept 03: Quiz: Select a Feature
- Concept 04: Types of Features
- Concept 05: Corner Detectors
- Concept 06: Dilation and Erosion
- Concept 07: Quiz: Find the Corners
- Concept 08: Feature Vectors
- Concept 09: HOG
- Concept 10: Implementing HOG
- Concept 11: Quiz: Histogram Bins
- Concept 12: Object Recognition
- Concept 13: Train a Classifier
- Concept 14: SVM Classifier
- Concept 15: Haar Cascades
- Concept 16: Face Detection with OpenCV
- Concept 17: Motion
- Concept 18: Optical Flow
- Concept 19: Object Tracking
- Concept 20: Outro
-
Lesson 04: CV Capstone: Facial Keypoint Detection
You'll apply what you've learned about the computer vision pipeline and build an end-to-end facial keypoint recognition system!
-
-
Module 03: Completing the Program
-
Lesson 01: Completing the Program
Congratulations! You've reached the end of the Artificial Intelligence Nanodegree program! Read on to learn how to officially complete the program and graduate.
-
Part 04 : Natural Language Processing
In this module, you will build end-to-end Natural Language Processing pipelines, from text processing to feature extraction and modeling, for tasks such as Sentiment Analysis, Spam Detection, and Machine Translation. You'll also learn how to design Recurrent Neural Networks for challenging NLP applications.
-
Module 01: Introduction to Natural Language Processing
-
Lesson 02: Intro to Natural Language Processing
Find out how Natural Language Processing is being used in the industry, why it is challenging, and learn to design an NLP solution using IBM Watson's cloud-based services.
- Concept 01: NLP Overview
- Concept 02: Structured Languages
- Concept 03: Grammar
- Concept 04: Unstructured Text
- Concept 05: Counting Words
- Concept 06: Context Is Everything
- Concept 07: NLP and IBM Watson
- Concept 08: Applications of NLP
- Concept 09: Challenges in NLP
- Concept 10: NLP Services
- Concept 11: Getting Started with Watson
- Concept 12: Deploying a Bluemix Application
- Concept 13: Towards Augmented Intelligence
- Concept 14: [Preview] Project: Bookworm
-
Lesson 03: Bookworm
Learn how to build a simple question-answering agent using IBM Watson.
-
Module 02: NLP Fundamentals
-
Lesson 01: Natural Language Processing
An overview of how to build an end-to-end Natural Language Processing pipeline.
-
Lesson 02: Text Processing
Learn to prepare text obtained from different sources for further processing by cleaning, normalizing, and splitting it into individual words or tokens.
- Concept 01: Text Processing
- Concept 02: Coding Exercises
- Concept 03: Capturing Text Data
- Concept 04: Quiz: Read Text Files
- Concept 05: Cleaning
- Concept 06: Normalization
- Concept 07: Tokenization
- Concept 08: Quiz: Split Sentences
- Concept 09: Stop Word Removal
- Concept 10: Part-of-Speech Tagging
- Concept 11: Named Entity Recognition
- Concept 12: Stemming and Lemmatization
- Concept 13: Summary
-
Lesson 03: Feature Extraction
Transform text using methods like Bag-of-Words, TF-IDF, Word2Vec, and GloVe to extract features that you can use in machine learning models. (A TF-IDF sketch follows.)
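For instance, a minimal TF-IDF extraction with scikit-learn on a toy corpus (a real pipeline would first clean and tokenize, as in the previous lesson):

```python
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = ["the quick brown fox", "the lazy dog", "the quick dog"]
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(corpus)    # sparse (3 docs x vocab) matrix
print(sorted(vectorizer.vocabulary_))   # learned vocabulary
print(X.toarray().round(2))             # per-document TF-IDF weights
```

Ubiquitous words like "the" receive low weights because they appear in every document, while rare, distinctive words receive high ones.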
-
Lesson 04: Modeling
A selection of different NLP tasks and how to build models that accomplish them.
-
Lesson 05: Machine Translation
Apply the skills you've learned in Natural Language Processing to the challenging and extremely rewarding task of Machine Translation. Bonne chance!
-
-
Module 03: NLP: Supplementary
-
Lesson 01: Embeddings and Word2Vec
In this lesson, you'll learn about embeddings in neural networks by implementing the word2vec model.
- Concept 01: Additional NLP Lessons
- Concept 02: Embeddings Intro
- Concept 03: Implementing Word2Vec
- Concept 04: Subsampling Solution
- Concept 05: Making Batches
- Concept 06: Batches Solution
- Concept 07: Building the Network
- Concept 08: Negative Sampling
- Concept 09: Building the Network Solution
- Concept 10: Training Results
-
Lesson 02: Sequence to Sequence
Here you'll learn about a specific architecture of RNNs for generating one sequence from another sequence. These RNNs are useful for chatbots, machine translation, and more!
- Concept 01: Introducing Jay Alammar
- Concept 02: Jay Introduction
- Concept 03: Applications
- Concept 04: Architectures
- Concept 05: Architectures in More Depth
- Concept 06: Preprocessing
- Concept 07: Sequence to sequence in TensorFlow
- Concept 08: Inputs
- Concept 09: Further Reading
- Concept 10: Sequence to Sequence in TensorFlow
-
-
Module 04: Completing the Program
-
Lesson 01: Completing the Program
Congratulations! You've reached the end of the Artificial Intelligence Nanodegree program! Read on to learn how to officially complete the program and graduate.
-
Part 05 : Voice User Interfaces
This module will help you get started in the exciting and fast-growing area of designing Voice User Interfaces! You'll learn how to build Conversational Agents that make products and services more natural to interact with. You will also dive deeper into the core challenge of Speech Recognition, applying Recurrent Neural Networks to solve it.
-
Module 01: Introduction to Voice User Interfaces
-
Lesson 03: Alexa History Skill
Create your own Alexa History Skill!
-
Module 02: Speech Recognition
-
Lesson 01: Introduction to Speech Recognition
Dive deeper into the exciting field of Speech Recognition, including cutting-edge deep learning technologies for Automatic Speech Recognition (ASR). (An n-gram sketch follows the concept list.)
- Concept 01: Intro
- Concept 02: Challenges in ASR
- Concept 03: Signal Analysis
- Concept 04: References: Signal Analysis
- Concept 05: Quiz: FFT
- Concept 06: Feature Extraction with MFCC
- Concept 07: References: Feature Extraction
- Concept 08: Quiz: MFCC
- Concept 09: Phonetics
- Concept 10: References: Phonetics
- Concept 11: Quiz: Phonetics
- Concept 12: Voice Data Lab Introduction
- Concept 13: Lab: Voice Data
- Concept 14: Acoustic Models and the Trouble with Time
- Concept 15: HMMs in Speech Recognition
- Concept 16: Language Models
- Concept 17: N-Grams
- Concept 18: Quiz: N-Grams
- Concept 19: References: Traditional ASR
- Concept 20: A New Paradigm
- Concept 21: Deep Neural Networks as Speech Models
- Concept 22: Connectionist Temporal Classification (CTC)
- Concept 23: References: Deep Neural Network ASR
- Concept 24: Outro
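A count-based bigram language model (Concepts 16-18) fits in a few lines; this toy version is unsmoothed for clarity:

```python
from collections import Counter, defaultdict

def bigram_model(sentences):
    """Estimate P(w2 | w1) by relative frequency of bigram counts."""
    counts = defaultdict(Counter)
    for sent in sentences:
        tokens = ['<s>'] + sent.lower().split() + ['</s>']
        for w1, w2 in zip(tokens, tokens[1:]):
            counts[w1][w2] += 1
    return {w1: {w2: c / sum(nxt.values()) for w2, c in nxt.items()}
            for w1, nxt in counts.items()}

model = bigram_model(["the cat sat", "the cat ran"])
print(model['cat'])  # {'sat': 0.5, 'ran': 0.5}
```

Traditional ASR decoders combine such language-model scores with the acoustic model's scores to rank candidate transcriptions.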
-
Lesson 02: DNN Speech Recognizer
Build an Automatic Speech Recognizer using deep learning RNNs.
-
-
Module 03: Completing the Program
-
Lesson 01: Completing the Program
Congratulations! You've reached the end of the Artificial Intelligence Nanodegree program! Read on to learn how to officially complete the program and graduate.
-
Part 06 (Career): Career: Job Search Strategies
Opportunity can come when you least expect it, so when your dream job comes along, you want to be ready. In the following lessons, you will learn strategies for conducting a successful job search, including developing a targeted resume and cover letter for that job.
-
Module 01: Conduct a Job Search
-
Lesson 01: Conduct a Job Search
Learn how to search for jobs effectively through industry research and by targeting your application to a specific role.
-
-
Module 02: Refine Your Resume
-
Lesson 01: Refine Your Entry-Level Resume
Receive a personalized review of your resume. This resume review is best suited for applicants who have 0-3 years of work experience in any industry.
-
Lesson 02: Refine Your Career Change Resume
Receive a personalized review of your resume. This resume review is best suited for applicants who have 3+ years of work experience in an unrelated field.
-
Lesson 03: Refine Your Prior Industry Experience Resume
Receive a personalized review of your resume. This resume review is best suited for applicants who have 3+ years of work experience in a related field.
Project Description - Resume Review Project (Prior Industry Experience)
Project Rubric - Resume Review Project (Prior Industry Experience)
- Concept 01: Convey Your Skills Concisely
- Concept 02: Effective Resume Components
- Concept 03: Resume Structure
- Concept 04: Describe Your Work Experiences
- Concept 05: Resume Reflection
- Concept 06: Resume Review
- Concept 07: Resume Review (Prior Industry Experience)
- Concept 08: Resources in Your Career Portal
-
-
Module 03: Write an Effective Cover Letter
-
Lesson 01: Craft Your Cover Letter
Get a personalized review of your cover letter. A successful cover letter will convey your enthusiasm, specific technical qualifications, and communication skills applicable to the position.
-
Part 07 (Career): Career: Networking
Networking is a very important component of a successful job search. In the following lesson, you will learn how to tell your unique story to recruiters in a succinct and professional but relatable way.
-
Module 01: Develop Your Personal Brand
-
Lesson 01: Develop Your Personal Brand
In this lesson, learn how to tell your unique story in a succinct and professional way. Communicate to employers that you know how to solve problems, overcome challenges, and achieve results.
-
Lesson 02: LinkedIn Review
Optimize your LinkedIn profile to show up in recruiter searches, build your network, and attract employers. Learn to read your LinkedIn profile through the lens of a recruiter or hiring manager.
-
Lesson 03: Udacity Professional Profile
Update and personalize your Udacity Professional Profile as you complete your Nanodegree program, and make your Profile visible to Udacity hiring partners when you’re ready to start your job search.
-
-
Module 02: GitHub Profile Review
-
Lesson 01: GitHub Review
Review how your GitHub profile, projects, and code represent you as a potential job candidate. Learn to assess your GitHub profile through the eyes of a recruiter or hiring manager.
- Concept 01: Introduction
- Concept 02: GitHub profile important items
- Concept 03: Good GitHub repository
- Concept 04: Interview with Art - Part 1
- Concept 05: Identify fixes for example “bad” profile
- Concept 06: Quick Fixes #1
- Concept 07: Quick Fixes #2
- Concept 08: Writing READMEs with Walter
- Concept 09: Interview with Art - Part 2
- Concept 10: Commit messages best practices
- Concept 11: Reflect on your commit messages
- Concept 12: Participating in open source projects
- Concept 13: Interview with Art - Part 3
- Concept 14: Participating in open source projects 2
- Concept 15: Starring interesting repositories
- Concept 16: Outro
- Concept 17: Resources in Your Career Portal
-
Part 08 (Elective): CVND
-
Module 01: Intro to CVND
-
Lesson 01: Image Representation & Classification
Learn how images are represented numerically and implement image processing techniques, such as color masking and binary classification.
- Concept 01: Intro to Pattern Recognition
- Concept 02: Emotional Intelligence
- Concept 03: Computer Vision Pipeline
- Concept 04: Training a Model
- Concept 05: Separating Data
- Concept 06: AffdexMe Demo
- Concept 07: Image Formation
- Concept 08: Images as Grids of Pixels
- Concept 09: Notebook: Images as Numerical Data
- Concept 10: Color Images
- Concept 11: Color or Grayscale?
- Concept 12: Notebook: Visualizing RGB Channels
- Concept 13: Color Thresholds
- Concept 14: Coding a Blue Screen
- Concept 15: Notebook: Blue Screen
- Concept 16: Notebook: Green Screen
- Concept 17: Color Spaces and Transforms
- Concept 18: Notebook: Color Conversion
- Concept 19: Day and Night Classification Challenge
- Concept 20: Notebook: Load and Visualize the Data
- Concept 21: Labeled Data and Accuracy
- Concept 22: Distinguishing Traits
- Concept 23: Features
- Concept 24: Standardizing Output
- Concept 25: Notebook: Standardizing Day and Night Images
- Concept 26: Average Brightness
- Concept 27: Notebook: Average Brightness Feature Extraction
- Concept 28: Classification
- Concept 29: Notebook: Classification
- Concept 30: Evaluation Metrics
- Concept 31: Notebook: Accuracy and Misclassification
- Concept 32: Review and the Computer Vision Pipeline
-
Lesson 02: Convolutional Filters and Edge Detection
Learn about frequency in images and implement your own image filters for detecting edges and shapes in an image. Use a computer vision library to perform face detection.
- Concept 01: Filters and Finding Edges
- Concept 02: Frequency in Images
- Concept 03: Notebook: Fourier Transforms
- Concept 04: Quiz: Fourier Transform Image
- Concept 05: High-pass Filters
- Concept 06: Quiz: Kernels
- Concept 07: Creating a Filter
- Concept 08: Gradients and Sobel Filters
- Concept 09: Notebook: Finding Edges
- Concept 10: Low-pass Filters
- Concept 11: Gaussian Blur
- Concept 12: Notebook: Gaussian Blur
- Concept 13: Notebook: Fourier Transforms of Filters
- Concept 14: Convolutional Layer
- Concept 15: Canny Edge Detector
- Concept 16: Notebook: Canny Edge Detection
- Concept 17: Shape Detection
- Concept 18: Hough Transform
- Concept 19: Quiz: Hough Space
- Concept 20: Hough Line Detection
- Concept 21: Notebook: Hough Detections
- Concept 22: Object Recognition & Introducing Haar Cascades
- Concept 23: Haar Cascades
- Concept 24: Notebook: Haar Cascade Face Detection
- Concept 25: Face Recognition and the Dangers of Bias
- Concept 26: Beyond Edges, Selecting Different Features
-
Lesson 03: Types of Features & Image Segmentation
Program a corner detector and learn techniques, like k-means clustering, for segmenting an image into unique parts.
- Concept 01: Types of Features
- Concept 02: Corner Detectors
- Concept 03: Notebook: Find the Corners
- Concept 04: Dilation and Erosion
- Concept 05: Image Segmentation
- Concept 06: Image Contours
- Concept 07: Notebook: Find Contours and Features
- Concept 08: Solution: Find Contours and Features
- Concept 09: K-means Clustering
- Concept 10: K-means Implementation
- Concept 11: Notebook: K-means Clustering
-
Lesson 04: Feature Vectors
Learn how to describe objects and images using feature vectors.
- Concept 01: Corners and Object Detection
- Concept 02: Feature Vectors
- Concept 03: Real-Time Feature Detection
- Concept 04: Introduction to ORB
- Concept 05: FAST
- Concept 06: Quiz: FAST Keypoints
- Concept 07: BRIEF
- Concept 08: Scale and Rotation Invariance
- Concept 09: Notebook: Image Pyramids
- Concept 10: Feature Matching
- Concept 11: ORB in Video
- Concept 12: Notebook: Implementing ORB
- Concept 13: HOG
- Concept 14: Notebook: Implementing HOG
- Concept 15: Learning to Find Features
-
Lesson 05: CNN Layers and Feature Visualization
Define and train your own convolutional neural network for clothing recognition. Use feature visualization techniques to see what a network has learned.
- Concept 01: Introduction to CNN Layers
- Concept 02: Review: Training a Neural Network
- Concept 03: Lesson Outline and Data
- Concept 04: CNN Architecture, VGG-16
- Concept 05: Convolutional Layers
- Concept 06: Defining Layers in PyTorch
- Concept 07: Notebook: Visualizing a Convolutional Layer
- Concept 08: Pooling, VGG-16 Architecture
- Concept 09: Pooling Layers
- Concept 10: Notebook: Visualizing a Pooling Layer
- Concept 11: Fully-Connected Layers, VGG-16
- Concept 12: Notebook: Visualizing FashionMNIST
- Concept 13: Training in PyTorch
- Concept 14: Notebook: Fashion MNIST Training Exercise
- Concept 15: Notebook: FashionMNIST, Solution 1
- Concept 16: Review: Dropout
- Concept 17: Notebook: FashionMNIST, Solution 2
- Concept 18: Network Structure
- Concept 19: Feature Visualization
- Concept 20: Feature Maps
- Concept 21: First Convolutional Layer
- Concept 22: Visualizing CNNs (Part 2)
- Concept 23: Visualizing Activations
- Concept 24: Notebook: Feature Viz for FashionMNIST
- Concept 25: Notebook: Visualize Your Net Layers
- Concept 26: Last Feature Vector and t-SNE
- Concept 27: Occlusion, Saliency, and Guided Backpropagation
- Concept 28: Summary of Feature Viz
- Concept 29: Image Classification & Regression Challenges
-
-
Module 02: Advanced CV & Deep Learning
-
Lesson 01: Advanced CNN Architectures
Learn about advances in CNN architectures and see how region-based CNNs, like Faster R-CNN, have allowed for fast, localized object recognition in images.
- Concept 01: CNNs and Scene Understanding
- Concept 02: More than Classification
- Concept 03: Classification and Localization
- Concept 04: Bounding Boxes and Regression
- Concept 05: Quiz: Loss Values
- Concept 06: Region Proposals
- Concept 07: R-CNN
- Concept 08: Fast R-CNN
- Concept 09: Faster R-CNN
- Concept 10: Detection With and Without Proposals
-
Lesson 02: YOLO
Learn about the YOLO (You Only Look Once) multi-object detection model and work with a YOLO implementation. (An IoU sketch follows the concept list.)
- Concept 01: Introduction to YOLO
- Concept 02: YOLO Output
- Concept 03: Sliding Windows, Revisited
- Concept 04: CNN & Sliding Windows
- Concept 05: Using a Grid
- Concept 06: Training on a Grid
- Concept 07: Generating Bounding Boxes
- Concept 08: Quiz: Generating Boxes and Detecting Objects
- Concept 09: Too Many Boxes
- Concept 10: Intersection over Union (IoU)
- Concept 11: Quiz: IoU and Overlap Limits
- Concept 12: Non-Maximal Suppression
- Concept 13: Anchor Boxes
- Concept 14: YOLO Algorithm
- Concept 15: Notebook: YOLO Implementation
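Intersection over Union (Concept 10) in a few lines, using corner-format (x1, y1, x2, y2) boxes:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)  # zero if no overlap
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / float(area_a + area_b - inter)

print(iou((0, 0, 4, 4), (2, 2, 6, 6)))  # 4 / 28 ~= 0.143
```

Non-maximal suppression uses this score to discard boxes that overlap a higher-confidence detection too heavily.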
-
Lesson 03: RNNs
Explore how memory can be incorporated into a deep learning model using recurrent neural networks (RNNs). Learn how RNNs can learn from and generate ordered sequences of data.
- Concept 01: RNNs in Computer Vision
- Concept 02: RNN Introduction
- Concept 03: RNN History
- Concept 04: RNN Applications
- Concept 05: Feedforward Neural Network - Reminder
- Concept 06: The Feedforward Process
- Concept 07: Feedforward Quiz
- Concept 08: Backpropagation - Theory
- Concept 09: Backpropagation - Example (part a)
- Concept 10: Backpropagation - Example (part b)
- Concept 11: Backpropagation Quiz
- Concept 12: RNN (part a)
- Concept 13: RNN (part b)
- Concept 14: RNN - Unfolded Model
- Concept 15: Unfolded Model Quiz
- Concept 16: RNN - Example
- Concept 17: Backpropagation Through Time (part a)
- Concept 18: Backpropagation Through Time (part b)
- Concept 19: Backpropagation Through Time (part c)
- Concept 20: BPTT Quiz 1
- Concept 21: BPTT Quiz 2
- Concept 22: BPTT Quiz 3
- Concept 23: Some more math
- Concept 24: RNN Summary
- Concept 25: From RNN to LSTM
- Concept 26: Wrap Up
-
Lesson 04: Long Short-Term Memory Networks (LSTMs)
Luis explains Long Short-Term Memory Networks (LSTM) and similar architectures that have the benefit of preserving long-term memory.
- Concept 01: Intro to LSTM
- Concept 02: RNN vs LSTM
- Concept 03: Basics of LSTM
- Concept 04: Architecture of LSTM
- Concept 05: Notebook: LSTM Structure and Hidden State, PyTorch
- Concept 06: The Learn Gate
- Concept 07: The Forget Gate
- Concept 08: The Remember Gate
- Concept 09: The Use Gate
- Concept 10: Putting it All Together
- Concept 11: Quiz
- Concept 12: Notebook: LSTM for Part of Speech Tagging
- Concept 13: Character-Level RNN
- Concept 14: Sequence Batching
- Concept 15: Notebook: Character-Level LSTM
- Concept 16: Other architectures
-
Lesson 05: Hyperparameters
Learn about a number of different hyperparameters that are used in defining and training deep learning models. We'll discuss starting values and intuitions for tuning each hyperparameter.
- Concept 01: Introducing Jay
- Concept 02: Introduction
- Concept 03: Learning Rate
- Concept 04: Learning Rate
- Concept 05: Minibatch Size
- Concept 06: Number of Training Iterations / Epochs
- Concept 07: Number of Hidden Units / Layers
- Concept 08: RNN Hyperparameters
- Concept 09: RNN Hyperparameters
- Concept 10: Sources & References
-
Lesson 06: Optional: Attention Mechanisms
Attention is one of the most important recent innovations in deep learning. In this section, you'll learn how attention models work and go over a basic code implementation.
- Concept 01: Introduction to Attention
- Concept 02: Encoders and Decoders
- Concept 03: Elective: Text Sentiment Analysis
- Concept 04: Sequence to Sequence Recap
- Concept 05: Encoding -- Attention Overview
- Concept 06: Decoding -- Attention Overview
- Concept 07: Attention Overview
- Concept 08: Attention Encoder
- Concept 09: Attention Decoder
- Concept 10: Attention Encoder & Decoder
- Concept 11: Bahdanau and Luong Attention
- Concept 12: Multiplicative Attention
- Concept 13: Additive Attention
- Concept 14: Additive and Multiplicative Attention
- Concept 15: Computer Vision Applications
- Concept 16: Other Attention Methods
- Concept 17: The Transformer and Self-Attention
- Concept 18: Notebook: Attention Basics
- Concept 19: [SOLUTION]: Attention Basics
- Concept 20: Outro
-
Lesson 07: Image Captioning
Learn how to combine CNNs and RNNs to build a complex, automatic image captioning model.
- Concept 01: Introduction to Image Captioning
- Concept 02: Leveraging Neural Networks
- Concept 03: Captions and the COCO Dataset
- Concept 04: Visualize the Dataset
- Concept 05: CNN-RNN Model
- Concept 06: The Glue, Feature Vector
- Concept 07: Tokenizing Captions
- Concept 08: Tokenizing Words
- Concept 09: RNN Training
- Concept 10: Video Captioning
- Concept 11: On to the Project!
-
Lesson 08: Optional: Cloud Computing with AWS
Take advantage of Amazon's GPUs to train your neural network faster. In this lesson, you'll learn how to setup an instance on AWS and train a neural network on a GPU.
-
-
Module 03: Object Tracking & Localization
-
Lesson 01: Introduction to Motion
This lesson introduces a way to represent motion mathematically, outlines what you'll learn in this section, and introduces optical flow.
-
Lesson 02: Robot Localization
Learn to implement a Bayesian filter to locate a robot in space and represent uncertainty in robot motion. (A sense/move sketch follows the concept list.)
- Concept 01: Probability Review
- Concept 02: Uncertainty and Bayes' Rule
- Concept 03: Reducing Uncertainty
- Concept 04: Probability Distributions
- Concept 05: Localization
- Concept 06: Total Probability
- Concept 07: Notebook: 1D Robot World
- Concept 08: Probability After Sense
- Concept 09: Notebook: Probability After Sense
- Concept 10: Normalize Distribution
- Concept 11: Sense Function
- Concept 12: Notebook: Sense Function
- Concept 13: Answer: Sense Function
- Concept 14: Normalized Sense Function
- Concept 15: Notebook: Normalized Sense Function
- Concept 16: Answer: Normalized Sense Function
- Concept 17: Test Sense Function
- Concept 18: Multiple Measurements
- Concept 19: Notebook: Multiple Measurements
- Concept 20: Answer: Multiple Measurements
- Concept 21: Exact Motion
- Concept 22: Move Function
- Concept 23: Notebook: Move Function
- Concept 24: Answer: Move Function
- Concept 25: Inexact Motion
- Concept 26: Inexact Move Function
- Concept 27: Notebook: Inexact Move Function
- Concept 28: Answer: Inexact Move Function
- Concept 29: Limit Distribution
- Concept 30: Move Twice
- Concept 31: Move 1000
- Concept 32: Notebook: Multiple Moves
- Concept 33: Sense and Move
- Concept 34: Notebook: Sense and Move Cycle
- Concept 35: Answer: Sense and Move
- Concept 36: Sense and Move 2
- Concept 37: Localization Summary
- Concept 38: C++ Elective & Implementation
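The whole sense/move cycle fits in a few lines; the world, the colors, and the `p_hit`/`p_miss` values below are illustrative rather than the notebooks' exact numbers:

```python
def sense(p, world, measurement, p_hit=0.6, p_miss=0.2):
    """Bayesian measurement update over a discrete 1D world."""
    q = [prob * (p_hit if cell == measurement else p_miss)
         for prob, cell in zip(p, world)]
    s = sum(q)
    return [x / s for x in q]  # normalize so the belief sums to 1

def move(p, step):
    """Exact cyclic motion: shift the belief distribution by `step`."""
    return [p[(i - step) % len(p)] for i in range(len(p))]

world = ['green', 'red', 'red', 'green', 'green']
p = [0.2] * 5                        # uniform prior: maximum uncertainty
p = move(sense(p, world, 'red'), 1)  # one full sense/move cycle
print([round(x, 3) for x in p])
```

Sensing sharpens and renormalizes the belief; inexact motion (Concepts 25-28) would instead smear it across neighboring cells.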
-
Lesson 03: Mini-project: 2D Histogram Filter
Write sense and move functions (and debug) a 2D histogram filter!
-
Lesson 04: Introduction to Kalman Filters
Learn the intuition behind the Kalman Filter, a vehicle tracking algorithm, and implement a one-dimensional tracker of your own. (A 1D update/predict sketch follows the concept list.)
- Concept 01: Kalman Filters and Linear Algebra
- Concept 02: Introduction
- Concept 03: Tracking Intro
- Concept 04: Answer: Tracking Intro
- Concept 05: Gaussian Intro
- Concept 06: Answer: Gaussian Intro
- Concept 07: Quiz: Variance and Preferred Gaussian
- Concept 08: Answer: Variance and Preferred Gaussian
- Concept 09: Gaussian Function and Maximum
- Concept 10: Quiz: Shifting the Mean
- Concept 11: Answer: Shifting the Mean
- Concept 12: Quiz: Predicting the Peak
- Concept 13: Answer: Predicting the Peak
- Concept 14: Quiz: Parameter Update
- Concept 15: Answer: Parameter Update
- Concept 16: Notebook: New Mean and Variance
- Concept 17: Solution: New Mean and Variance
- Concept 18: Quiz: Gaussian Motion
- Concept 19: Answer: Gaussian Motion
- Concept 20: Predict Function
- Concept 21: Notebook: Predict Function
- Concept 22: Answer: Predict Function
- Concept 23: Kalman Filter Code
- Concept 24: Notebook: 1D Kalman Filter
- Concept 25: Answer: 1D Kalman Filter
- Concept 26: Kalman Prediction
- Concept 27: Next: Motion Models and State
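The 1D tracker reduces to two tiny functions; the measurement and motion variances below are illustrative:

```python
def update(mean1, var1, mean2, var2):
    """Measurement update: multiply two Gaussians."""
    new_mean = (var2 * mean1 + var1 * mean2) / (var1 + var2)
    new_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    return new_mean, new_var

def predict(mean1, var1, mean2, var2):
    """Motion update: add the motion Gaussian."""
    return mean1 + mean2, var1 + var2

mu, sig = 0.0, 10000.0                  # vague prior
for z, u in [(5.0, 1.0), (6.0, 1.0)]:   # measurement z, then motion u
    mu, sig = update(mu, sig, z, 4.0)   # measurement variance 4
    mu, sig = predict(mu, sig, u, 2.0)  # motion variance 2
print(mu, sig)
```

`update` always shrinks the variance (two Gaussians multiplied are more certain than either alone), while `predict` always grows it.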
-
Lesson 05: Representing State and Motion
Learn about representing the state of a car in a vector that can be modified using linear algebra.
- Concept 01: Localization Steps
- Concept 02: Intro to State
- Concept 03: Motion Models
- Concept 04: Quiz: Predicting State
- Concept 05: A Different Model
- Concept 06: Kinematics
- Concept 07: Quantifying State
- Concept 08: Lesson Outline
- Concept 09: Always Moving
- Concept 10: Car Object
- Concept 11: Interacting with a Car Object
- Concept 12: Look at the Class Code
- Concept 13: Turn Right
- Concept 14: Adding Color
- Concept 15: Instantiate Multiple Cars
- Concept 16: Color Class
- Concept 17: Overloading Functions
- Concept 18: Overloading Color Addition
- Concept 19: State Vector
- Concept 20: State Transformation Matrix
- Concept 21: Matrix Multiplication
- Concept 22: 1D State Vector and More Multiplication
- Concept 23: Modify Predict State
- Concept 24: Working with Matrices
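A tiny sketch of the idea, assuming a constant-velocity model with state [position, velocity] (sizes and values are illustrative):

```python
import numpy as np

# Predict the next state with a state transformation matrix: x' = F x.
F = np.array([[1.0, 1.0],     # position' = position + velocity * dt (dt = 1)
              [0.0, 1.0]])    # velocity' = velocity (constant-velocity model)

x = np.array([[0.0],          # initial position
              [2.0]])         # velocity of 2 units per time step

for _ in range(3):
    x = F @ x                 # each multiplication advances one time step
print(x.ravel())              # -> [6. 2.]
```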
-
Lesson 06: Matrices and Transformation of State
Linear Algebra is a rich branch of math and a useful tool. In this lesson you'll learn about the matrix operations that underlie multidimensional Kalman Filters. A pure-Python sketch of a few of these operations follows the concept list below.
- Concept 01: Kalman Filter Land
- Concept 02: Kalman Filter Prediction
- Concept 03: Another Prediction
- Concept 04: More Kalman Filters
- Concept 05: A Note on Notation
- Concept 06: Kalman Filter Design
- Concept 07: Let's Look at Where We Are
- Concept 08: The Kalman Filter Equations
- Concept 09: Simplifying the Kalman Filter Equations
- Concept 10: The Rest of the Lesson
- Concept 11: Representing State with Matrices
- Concept 12: Kalman Equation Reference
- Concept 13: What is a vector?
- Concept 14: Vectors in Python
- Concept 15: Coding Vectors
- Concept 16: Coding Vectors (solution)
- Concept 17: Guide to Mathematical Notation
- Concept 18: Matrices in Python
- Concept 19: Coding Matrices
- Concept 20: Coding Matrices (Solution)
- Concept 21: Matrix Addition
- Concept 22: Coding Matrix Addition
- Concept 23: Matrix Multiplication
- Concept 24: Coding Matrix Multiplication
- Concept 25: Transpose of a Matrix
- Concept 26: Coding the Transpose
- Concept 27: The Identity Matrix
- Concept 28: Coding Identity Matrix
- Concept 29: Matrix Inverse
- Concept 30: Coding Matrix Inverse
- Concept 31: What to Take Away from this Lesson
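The lesson has you implement these operations yourself rather than call numpy. A pure-Python sketch of three of them (helper names are illustrative):

```python
# Loop-level matrix helpers of the kind this lesson has you build.
def transpose(A):
    return [[A[r][c] for r in range(len(A))] for c in range(len(A[0]))]

def matmul(A, B):
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[sum(A[r][k] * B[k][c] for k in range(len(B)))
             for c in range(len(B[0]))]
            for r in range(len(A))]

def identity(n):
    return [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]

A = [[1, 2], [3, 4]]
print(matmul(A, identity(2)))   # -> [[1.0, 2.0], [3.0, 4.0]]
print(transpose(A))             # -> [[1, 3], [2, 4]]
```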
-
Lesson 07: Simultaneous Localization and Mapping
Learn how to implement SLAM: simultaneously localize an autonomous vehicle and create a map of landmarks in an environment. A tiny omega/xi example follows the concept list below.
- Concept 01: Introduction to SLAM
- Concept 02: Quiz: Graph SLAM
- Concept 03: Answer: Graph SLAM
- Concept 04: Quiz: Implementing Constraints
- Concept 05: Answer: Implementing Constraints
- Concept 06: Quiz: Adding Landmarks
- Concept 07: Answer: Adding Landmarks
- Concept 08: Quiz: Matrix Modification
- Concept 09: Answer: Matrix Modification
- Concept 10: Quiz: Untouched Fields
- Concept 11: Answer: Untouched Fields
- Concept 12: Quiz: Omega and Xi
- Concept 13: Notebook: Omega and Xi
- Concept 14: Quiz: Landmark Position
- Concept 15: Answer: Landmark Position
- Concept 16: Notebook: Including Sensor Measurements
- Concept 17: Quiz: Introducing Noise
- Concept 18: Answer: Introducing Noise
- Concept 19: Confident Measurements
- Concept 20: Notebook: Confident Measurements
- Concept 21: SLAM Summary
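To make the omega/xi bookkeeping concrete, here is a toy 1D example with one motion and one landmark sighting. The constraint values are illustrative; the lesson's notebooks use a larger world:

```python
import numpy as np

# Graph SLAM in 1D: every constraint adds into omega and xi,
# and the best estimate is mu = omega^-1 @ xi.
omega = np.zeros((3, 3))            # unknowns: x0, x1, landmark L
xi = np.zeros((3, 1))

omega[0, 0] += 1.0                  # initial pose constraint: x0 = 0

# motion constraint: x1 - x0 = 5
omega[np.ix_([0, 1], [0, 1])] += [[1.0, -1.0], [-1.0, 1.0]]
xi[0] += -5.0
xi[1] += 5.0

# measurement constraint: L - x0 = 2
omega[np.ix_([0, 2], [0, 2])] += [[1.0, -1.0], [-1.0, 1.0]]
xi[0] += -2.0
xi[2] += 2.0

mu = np.linalg.inv(omega) @ xi
print(mu.ravel())                   # -> [0. 5. 2.]
```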
-
Lesson 08: Optional: Vehicle Motion and Calculus
Review the basics of calculus and see how to derive the x and y components of a self-driving car's motion from sensor measurements and other data. A short numeric-integration sketch follows the concept list below.
- Concept 01: Introduction to Odometry
- Concept 02: Inertial Navigation Sensors
- Concept 03: Plotting Position vs. Time
- Concept 04: Interpreting Position vs. Time Graphs
- Concept 05: A "Typical" Calculus Problem
- Concept 06: How Odometers Work
- Concept 07: Speed from Position Data
- Concept 08: Position, Velocity, and Acceleration
- Concept 09: Implement an Accelerometer
- Concept 10: Differentiation Recap
- Concept 11: Acceleration Basics
- Concept 12: Plotting Elevator Acceleration
- Concept 13: Reasoning About Two Peaks
- Concept 14: The Integral: Area Under a Curve
- Concept 15: Approximating the Integral
- Concept 16: Approximating Integrals with Code
- Concept 17: Integrating Accelerometer Data
- Concept 18: Rate Gyros
- Concept 19: Integrating Rate Gyro Data
- Concept 20: Working with Real Data
- Concept 21: Accumulating Errors
- Concept 22: Sensor Strengths and Weaknesses
- Concept 23: Summary and Back to Trigonometry
- Concept 24: Trigonometry and Vehicle Motion
- Concept 25: Solving Trig Problems
- Concept 26: Keeping Track of x and y
- Concept 27: Keeping Track of x and y (solution)
- Concept 28: Conclusion
- Concept 29: Project Overview
- Concept 30: Lab - Reconstructing Trajectories
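The computational core of the lab is approximating integrals of sampled sensor data. A sketch with made-up accelerometer samples:

```python
# Acceleration -> velocity -> position by summing rectangles of width dt.
dt = 0.1
acc = [1.0] * 20 + [0.0] * 20 + [-1.0] * 20   # speed up, coast, brake

v, x = 0.0, 0.0
for a in acc:
    v += a * dt      # each a*dt rectangle adds to velocity
    x += v * dt      # each v*dt rectangle adds to position
print(round(v, 2), round(x, 2))    # velocity returns near 0 after braking
```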
-
-
Module 04: Applications of Computer Vision & Deep Learning
-
Lesson 01: Applying Deep Learning Models
Try out a few really cool applications of computer vision and deep learning, such as style transfer, using pre-trained models that others have generously provided on GitHub.
-
-
Module 05: Review: Training a Neural Network
-
Lesson 01: Feedforward and Backpropagation
Short introduction to neural networks: how they train by doing a feedforward pass then performing backpropagation.
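If you want the two passes side by side in code, here is a minimal numpy sketch of a one-hidden-layer network (shapes, data, and learning rate are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                 # 4 examples, 3 features
y = np.array([[0.0], [1.0], [1.0], [0.0]])  # arbitrary targets
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 1))

for _ in range(1000):
    h = sigmoid(X @ W1)                 # feedforward pass
    out = sigmoid(h @ W2)
    err = y - out                       # backpropagation: chain rule...
    d_out = err * out * (1 - out)       # ...through the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)  # ...then through the hidden layer
    W2 += 0.5 * h.T @ d_out             # gradient steps that shrink the error
    W1 += 0.5 * X.T @ d_h
print(np.round(out.ravel(), 2))         # outputs move toward y
```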
-
Lesson 02: Training Neural Networks
Now that you know what neural networks are, in this lesson you will learn several techniques to improve their training. A momentum and learning-rate decay sketch follows the concept list below.
- Concept 01: Instructor
- Concept 02: Training Optimization
- Concept 03: Testing
- Concept 04: Overfitting and Underfitting
- Concept 05: Early Stopping
- Concept 06: Regularization
- Concept 07: Regularization 2
- Concept 08: Dropout
- Concept 09: Local Minima
- Concept 10: Random Restart
- Concept 11: Vanishing Gradient
- Concept 12: Other Activation Functions
- Concept 13: Batch vs Stochastic Gradient Descent
- Concept 14: Learning Rate Decay
- Concept 15: Momentum
- Concept 16: Error Functions Around the World
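Two of these techniques, momentum and learning rate decay, fit in a few lines. A sketch on a 1D quadratic error (all constants illustrative):

```python
# Gradient descent with momentum and a simple learning-rate decay schedule.
def grad(w):                    # derivative of the error (w - 3)^2
    return 2.0 * (w - 3.0)

w, velocity = -5.0, 0.0
beta, base_lr = 0.9, 0.1
for step in range(200):
    lr = base_lr / (1 + 0.01 * step)           # decay: smaller steps later
    velocity = beta * velocity - lr * grad(w)  # momentum remembers past steps
    w += velocity
print(round(w, 3))              # close to the minimum at w = 3
```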
-
Lesson 03: Deep Learning with PyTorch
Learn how to use PyTorch for building deep learning models. A minimal training-step sketch follows the concept list below.
- Concept 01: Instructor
- Concept 02: Introducing PyTorch
- Concept 03: PyTorch Tensors
- Concept 04: Defining Networks
- Concept 05: Training Networks
- Concept 06: Fashion-MNIST Exercise
- Concept 07: Inference & Validation
- Concept 08: Saving and Loading Trained Networks
- Concept 09: Loading Data Sets with Torchvision
- Concept 10: Transfer Learning
- Concept 11: Transfer Learning Solution
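A minimal sketch of the pattern this lesson builds toward: define a network, then run one training step. The architecture and the random stand-in batch are illustrative:

```python
import torch
from torch import nn, optim

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(),
                      nn.Linear(128, 10), nn.LogSoftmax(dim=1))
criterion = nn.NLLLoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(64, 784)             # stand-in for a flattened image batch
labels = torch.randint(0, 10, (64,))

optimizer.zero_grad()                     # clear gradients from the last step
loss = criterion(model(images), labels)   # forward pass
loss.backward()                           # backpropagate
optimizer.step()                          # update the weights
print(loss.item())
```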
-
-
Module 06: Skin Cancer Detection
-
Lesson 01: Deep Learning for Cancer Detection with Sebastian Thrun
Sebastian Thrun teaches us about his groundbreaking work detecting skin cancer with convolutional neural networks. A sensitivity/specificity sketch follows the concept list below.
- Concept 01: Intro
- Concept 02: Skin Cancer
- Concept 03: Survival Probability of Skin Cancer
- Concept 04: Medical Classification
- Concept 05: The data
- Concept 06: Image Challenges
- Concept 07: Quiz: Data Challenges
- Concept 08: Solution: Data Challenges
- Concept 09: Training the Neural Network
- Concept 10: Quiz: Random vs Pre-initialized Weights
- Concept 11: Solution: Random vs Pre-initialized Weights
- Concept 12: Validating the Training
- Concept 13: Quiz: Sensitivity and Specificity
- Concept 14: Solution: Sensitivity and Specificity
- Concept 15: More on Sensitivity and Specificity
- Concept 16: Quiz: Diagnosing Cancer
- Concept 17: Solution: Diagnosing Cancer
- Concept 18: Refresh on ROC Curves
- Concept 19: Quiz: ROC Curve
- Concept 20: Solution: ROC Curve
- Concept 21: Comparing our Results with Doctors
- Concept 22: Visualization
- Concept 23: What is the network looking at?
- Concept 24: Refresh on Confusion Matrices
- Concept 25: Confusion Matrix
- Concept 26: Conclusion
- Concept 27: Useful Resources
- Concept 28: Mini Project Introduction
- Concept 29: Mini Project: Dermatologist AI
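Since sensitivity and specificity drive the evaluation in this lesson and the mini project, here is the arithmetic on a made-up confusion matrix:

```python
# Sensitivity and specificity from a 2x2 confusion matrix (counts are made up).
tp, fn = 45, 5      # malignant cases: caught vs. missed
fp, tn = 10, 40     # benign cases: false alarms vs. correctly cleared

sensitivity = tp / (tp + fn)     # of all malignant cases, fraction detected
specificity = tn / (tn + fp)     # of all benign cases, fraction cleared
print(sensitivity, specificity)  # -> 0.9 0.8
```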
-
-
Module 07: Review: Text Sentiment Analysis
-
Lesson 01: Sentiment Analysis
In this lesson, Andrew Trask, the author of Grokking Deep Learning, will walk you through using neural networks for sentiment analysis.
- Concept 01: Introducing Andrew Trask
- Concept 02: Meet Andrew
- Concept 03: Materials
- Concept 04: The Notebooks
- Concept 05: Framing the Problem
- Concept 06: Mini Project 1
- Concept 07: Mini Project 1 Solution
- Concept 08: Transforming Text into Numbers
- Concept 09: Mini Project 2
- Concept 10: Mini Project 2 Solution
- Concept 11: Building a Neural Network
- Concept 12: Mini Project 3
- Concept 13: Mini Project 3 Solution
- Concept 14: Understanding Neural Noise
- Concept 15: Mini Project 4
- Concept 16: Understanding Inefficiencies in our Network
- Concept 17: Mini Project 5
- Concept 18: Mini Project 5 Solution
- Concept 19: Further Noise Reduction
- Concept 20: Mini Project 6
- Concept 21: Mini Project 6 Solution
- Concept 22: Analysis: What's Going on in the Weights?
- Concept 23: Conclusion
-
-
Module 08: More Deep Learning Models
-
Lesson 01: Fully-Convolutional Neural Networks & Semantic Segmentation
Get a high-level overview of how fully-convolutional neural networks work, and see how they can be used to classify every pixel in an image. A short IoU computation follows the concept list below.
- Concept 01: Intro
- Concept 02: Why Fully Convolutional Networks (FCNs)?
- Concept 03: Fully Convolutional Networks
- Concept 04: Fully Connected to 1x1 Convolution
- Concept 05: Transposed Convolutions
- Concept 06: Skip Connections
- Concept 07: FCNs In The Wild
- Concept 08: Bounding Boxes
- Concept 09: Semantic Segmentation
- Concept 10: Semantic Segmentation and FCNs
- Concept 11: Scene Understanding
- Concept 12: IoU
- Concept 13: IoU Example
- Concept 14: FCN-8 Architecture
- Concept 15: Outro
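The IoU metric from Concepts 12 and 13 is a two-liner on binary masks. A sketch with a toy 3x3 prediction:

```python
import numpy as np

# Intersection over Union for one class of a segmentation mask.
pred   = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]])  # predicted pixels
target = np.array([[1, 0, 0], [1, 1, 0], [0, 0, 0]])  # ground-truth pixels

intersection = np.logical_and(pred, target).sum()
union = np.logical_or(pred, target).sum()
print(intersection / union)    # -> 0.5
```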
-
Lesson 02: 3D CNNs
Learn about the 3D CNN architecture, which adds a time dimension to the usual x-y image inputs. 3D CNNs are currently used to analyze and classify video clips.
-
-
Module 09: C++ Programming
-
Lesson 01: C++ Getting Started
Learn the differences between C++ and Python and how to write C++ code.
- Concept 01: Introduction
- Concept 02: Lesson Overview
- Concept 03: Elecia White
- Concept 04: Why C++
- Concept 05: Python and C++ Comparison
- Concept 06: Static vs Dynamic Typing
- Concept 07: C++ - A Statically Typed Language
- Concept 08: Basic Data Types
- Concept 09: Floating versus Double [demonstration]
- Concept 10: Doubles are Bigger
- Concept 11: Common Errors and Error Messages
- Concept 12: C++ Functions
- Concept 13: Anatomy of a Function
- Concept 14: Multiple Outputs
- Concept 15: Two Functions Same Name
- Concept 16: Function Signatures 1
- Concept 17: Function Signatures 2
- Concept 18: If and Boolean Logic
- Concept 19: While and For Loops
- Concept 20: Switch Statement
- Concept 21: Libraries
- Concept 22: Forge on!
-
Lesson 02: C++ Vectors
To program matrix algebra operations and translate your Python code, you will need to use C++ Vectors. These vectors are similar to Python lists, but the syntax can be somewhat tricky.
- Concept 01: C++ Vectors
- Concept 02: Namespaces
- Concept 03: Python Lists vs. C++ Vectors
- Concept 04: Initializing Vector Values
- Concept 05: Vector Methods
- Concept 06: Vectors and For Loops
- Concept 07: Math and Vectors
- Concept 08: 1D Vector Playground
- Concept 09: 2D Vectors
- Concept 10: 2D Vectors and For Loops
- Concept 11: 2D Vector Playground
- Concept 12: Next Lesson
-
Lesson 03: Practical C++
Learn how to write C++ code on your own computer and compile it into an executable program without running into too many compilation errors.
-
Lesson 04: C++ Object Oriented Programming
Learn the syntax of C++ object oriented programming as well as some of the additional OOP features provided by the language.
- Concept 01: Introduction
- Concept 02: Python vs. C++
- Concept 03: Why use Object Oriented Programming?
- Concept 04: Using a Class in C++ [Demo]
- Concept 05: Explanation of the Main.cpp File
- Concept 06: Practice Using a Class
- Concept 07: Review: Anatomy of a Class
- Concept 08: Other Facets of C++ Classes
- Concept 09: Private and Public
- Concept 10: Header Files
- Concept 11: Inclusion Guards
- Concept 12: Implement a Class
- Concept 13: Class Variables
- Concept 14: Class Function Declarations
- Concept 15: Constructor Functions
- Concept 16: Set and Get Functions
- Concept 17: Matrix Functions
- Concept 18: Use an Inclusion Guard
- Concept 19: Instantiate an Object
- Concept 20: Running your Program Locally
-
Lesson 05: Python and C++ Speed
In this lesson, we'll compare the execution times of C++ and Python programs.
-
Lesson 06: C++ Intro to Optimization
Optimizing C++ involves understanding how a computer actually runs your programs. You'll learn how C++ uses the CPU and RAM to execute your code and get a sense of what can slow things down.
- Concept 01: Course Introduction
- Concept 02: Empathize with the Computer
- Concept 03: Intro to Computer Hardware
- Concept 04: Embedded Terminal Explanation
- Concept 05: Demo: Machine Code
- Concept 06: Assembly Language
- Concept 07: Binary
- Concept 08: Demo: Binary
- Concept 09: Demo: Binary Floats
- Concept 10: Memory and the CPU
- Concept 11: Demo: Stack vs Heap
- Concept 12: Outro
-
Lesson 07: C++ Optimization Practice
Now you understand how C++ programs execute. It's time to learn specific optimization techniques and put them into practice. This lesson will prepare you for the code optimization project.
- Concept 01: Introduction
- Concept 02: Software Development and Optimization
- Concept 03: Optimization Techniques
- Concept 04: Dead Code
- Concept 05: Exercise: Remove Dead Code
- Concept 06: If Statements
- Concept 07: Exercise: If Statements
- Concept 08: For Loops
- Concept 09: Exercise: For Loops
- Concept 10: Intermediate Variables
- Concept 11: Exercise: Intermediate Variables
- Concept 12: Vector Storage
- Concept 13: Exercise: Vector Storage
- Concept 14: References
- Concept 15: Exercise: References
- Concept 16: Sebastian's Synchronization Story
- Concept 17: Static Keyword
- Concept 18: Exercise: Static Keyword
- Concept 19: Speed Challenge
-
Part 09 (Elective): NLPND
-
Module 01: Introduction to Natural Language Processing
-
Lesson 01: Welcome to Natural Language Processing
Welcome to the Natural Language Processing Nanodegree program!
-
Lesson 02: Intro to NLP
Arpan will give you an overview of how to build a Natural Language Processing pipeline.
- Concept 01: Introducing Arpan
- Concept 02: NLP Overview
- Concept 03: Structured Languages
- Concept 04: Grammar
- Concept 05: Unstructured Text
- Concept 06: Counting Words
- Concept 07: Context Is Everything
- Concept 08: NLP and Pipelines
- Concept 09: How NLP Pipelines Work
- Concept 10: Text Processing
- Concept 11: Feature Extraction
- Concept 12: Modeling
-
Lesson 03: Text Processing
Learn to prepare text obtained from different sources for further processing by cleaning, normalizing, and splitting it into individual words or tokens. A small end-to-end sketch follows the concept list below.
- Concept 01: Text Processing
- Concept 02: Coding Exercises
- Concept 03: Introduction to GPU Workspaces
- Concept 04: Workspaces: Best Practices
- Concept 05: Workspace
- Concept 06: Capturing Text Data
- Concept 07: Cleaning
- Concept 08: Normalization
- Concept 09: Tokenization
- Concept 10: Stop Word Removal
- Concept 11: Part-of-Speech Tagging
- Concept 12: Named Entity Recognition
- Concept 13: Stemming and Lemmatization
- Concept 14: Summary
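The whole pipeline of this lesson can be previewed in plain Python (the lesson itself uses NLTK; the stop word list here is a tiny illustrative stand-in):

```python
import re

text = "The first time you see The Second Renaissance it may look boring!"
stop_words = {"the", "you", "it", "may", "a", "an", "is"}  # illustrative list

text = text.lower()                       # normalize case
text = re.sub(r"[^a-z0-9]", " ", text)    # clean: strip punctuation
tokens = text.split()                     # tokenize on whitespace
tokens = [t for t in tokens if t not in stop_words]
print(tokens)
```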
-
Lesson 04: Spam Classifier with Naive Bayes
In this section, you'll learn how to build a spam e-mail classifier using the naive Bayes algorithm. A compact scikit-learn sketch follows the concept list below.
- Concept 01: Intro
- Concept 02: Guess the Person
- Concept 03: Known and Inferred
- Concept 04: Guess the Person Now
- Concept 05: Bayes Theorem
- Concept 06: Quiz: False Positives
- Concept 07: Solution: False Positives
- Concept 08: Bayesian Learning 1
- Concept 09: Bayesian Learning 2
- Concept 10: Bayesian Learning 3
- Concept 11: Naive Bayes Algorithm 1
- Concept 12: Naive Bayes Algorithm 2
- Concept 13: Building a Spam Classifier
- Concept 14: Project
- Concept 15: Spam Classifier - Workspace
- Concept 16: Outro
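A compact sketch of the project's approach with scikit-learn; the four-message corpus is illustrative (the project uses a real SMS spam dataset):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["win money now", "lowest price guaranteed",
        "are we meeting today", "call me when you are free"]
labels = [1, 1, 0, 0]                        # 1 = spam, 0 = ham

vec = CountVectorizer()                      # bag-of-words counts
X = vec.fit_transform(docs)
clf = MultinomialNB().fit(X, labels)         # naive Bayes on word counts
print(clf.predict(vec.transform(["win a free call today"])))
```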
-
Lesson 05: Part of Speech Tagging with HMMs
Luis will give you an overview of several part-of-speech tagging techniques, including a deeper dive into hidden Markov models. A bare-bones Viterbi decoder follows the concept list below.
- Concept 01: Intro
- Concept 02: Part of Speech Tagging
- Concept 03: Lookup Table
- Concept 04: Bigrams
- Concept 05: When bigrams won't work
- Concept 06: Hidden Markov Models
- Concept 07: Quiz: How many paths?
- Concept 08: Solution: How many paths
- Concept 09: Quiz: How many paths now?
- Concept 10: Quiz: Which path is more likely?
- Concept 11: Solution: Which path is more likely?
- Concept 12: Viterbi Algorithm Idea
- Concept 13: Viterbi Algorithm
- Concept 14: Further Reading
- Concept 15: Outro
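To see the Viterbi idea end to end, here is a bare-bones decoder over a two-state toy HMM. All probabilities are illustrative, not estimated from the lesson's corpus:

```python
states = ["NOUN", "VERB"]
start = {"NOUN": 0.6, "VERB": 0.4}
trans = {"NOUN": {"NOUN": 0.3, "VERB": 0.7},
         "VERB": {"NOUN": 0.8, "VERB": 0.2}}
emit = {"NOUN": {"fish": 0.6, "swim": 0.4},
        "VERB": {"fish": 0.3, "swim": 0.7}}

def viterbi(words):
    # best[t][s]: probability of the most likely tag path ending in s at step t
    best = [{s: start[s] * emit[s][words[0]] for s in states}]
    back = [{}]
    for t in range(1, len(words)):
        best.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: best[t - 1][p] * trans[p][s])
            best[t][s] = best[t - 1][prev] * trans[prev][s] * emit[s][words[t]]
            back[t][s] = prev
    # follow backpointers from the most likely final state
    path = [max(states, key=lambda s: best[-1][s])]
    for t in range(len(words) - 1, 0, -1):
        path.insert(0, back[t][path[0]])
    return path

print(viterbi(["fish", "swim"]))   # -> ['NOUN', 'VERB']
```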
-
Lesson 06: (Optional) IBM Watson Bookworm Lab
Learn how to build a simple question-answering agent using IBM Watson.
-
-
Module 02: Computing with Natural Language
-
Lesson 01: Feature extraction and embeddings
Transform text using methods like Bag-of-Words, TF-IDF, Word2Vec and GloVe to extract features that you can use in machine learning models.
-
Lesson 02: Topic Modeling
In this section, you'll learn to split a collection of documents into topics using Latent Dirichlet Allocation (LDA). In the lab, you'll be able to apply this model to a dataset of news articles. A short LDA sketch follows the concept list below.
- Concept 01: Intro
- Concept 02: References
- Concept 03: Bag of Words
- Concept 04: Latent Variables
- Concept 05: Matrix Multiplication
- Concept 06: Matrices
- Concept 07: Quiz: Picking Topics
- Concept 08: Solution: Picking Topics
- Concept 09: Beta Distributions
- Concept 10: Dirichlet Distributions
- Concept 11: Latent Dirichlet Allocation
- Concept 12: Sample a Topic
- Concept 13: Sample a Word
- Concept 14: Combining the Models
- Concept 15: Outro
- Concept 16: Notebook: Topic Modeling
- Concept 17: [SOLUTION] Topic Modeling
- Concept 18: Next Steps
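A short scikit-learn sketch of the lab's workflow on a four-document toy corpus (the lab itself uses real news articles):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = ["the game ended with a late goal", "fans cheered the winning team",
        "the election results were announced", "voters went to the polls today"]

vec = CountVectorizer(stop_words="english")     # bag-of-words counts
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

words = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [words[i] for i in topic.argsort()[-3:]]   # top words per topic
    print(f"topic {k}:", top)
```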
-
Lesson 03: Sentiment Analysis
Learn about using several machine learning classifiers, including Recurrent Neural Networks, to predict the sentiment in text. Apply this to a dataset of movie reviews.
- Concept 01: Intro
- Concept 02: Sentiment Analysis with a Regular Classifier
- Concept 03: Notebook: Sentiment Analysis with a regular classifier
- Concept 04: [SOLUTION]: Sentiment Analysis with a regular classifier
- Concept 05: Sentiment Analysis with RNN
- Concept 06: Notebook: Sentiment Analysis with an RNN
- Concept 07: [SOLUTION]: Sentiment Analysis with an RNN
- Concept 08: Optional Material
- Concept 09: Outro
-
Lesson 04: Sequence to Sequence
Here you'll learn about a specific architecture of RNNs for generating one sequence from another sequence. These RNNs are useful for chatbots, machine translation, and more!
-
Lesson 05: Deep Learning Attention
Attention is one of the most important recent innovations in deep learning. In this section, you'll learn how attention works, and you'll go over a basic implementation of it in the lab. A minimal scoring-and-context sketch follows the concept list below.
- Concept 01: Introduction to Attention
- Concept 02: Sequence to Sequence Recap
- Concept 03: Encoding -- Attention Overview
- Concept 04: Decoding -- Attention Overview
- Concept 05: Attention Overview
- Concept 06: Attention Encoder
- Concept 07: Attention Decoder
- Concept 08: Attention Encoder & Decoder
- Concept 09: Bahdanau and Luong Attention
- Concept 10: Multiplicative Attention
- Concept 11: Additive Attention
- Concept 12: Additive and Multiplicative Attention
- Concept 13: Computer Vision Applications
- Concept 14: NLP Application: Google Neural Machine Translation
- Concept 15: Other Attention Methods
- Concept 16: The Transformer and Self-Attention
- Concept 17: Notebook: Attention Basics
- Concept 18: [SOLUTION]: Attention Basics
- Concept 19: Outro
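The lab's core loop, scoring annotations, softmaxing, and building a context vector, fits in a few lines of numpy. Shapes and values here are illustrative:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
annotations = rng.normal(size=(4, 8))   # 4 encoder hidden states
dec_hidden = rng.normal(size=(8,))      # decoder state acting as the query

scores = annotations @ dec_hidden       # dot-product (multiplicative) scoring
weights = softmax(scores)               # attention weights sum to 1
context = weights @ annotations         # weighted sum of encoder states
print(weights.round(3), context.shape)
```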
-
Lesson 06: RNN Keras Lab
This section will prepare you for the Machine Translation project. Here you will get hands-on practice with RNNs in Keras.
-
Lesson 07: Cloud Computing Setup Instructions
Overview of the steps to configure a remote environment for GPU-accelerated training (Note: NLPND does not include AWS credits).
-
-
Module 03: Communicating with Natural Language
-
Lesson 01: Intro to Voice User Interfaces
Get acquainted with the principles and applications of VUI, and get introduced to Alexa skills.
-
Lesson 02: (Optional) Alexa History Skill
Build your own Alexa skill and deploy it!
-
Lesson 03: Speech Recognition
Learn how an automatic speech recognition (ASR) pipeline works.
- Concept 01: Intro
- Concept 02: Challenges in ASR
- Concept 03: Signal Analysis
- Concept 04: References: Signal Analysis
- Concept 05: Quiz: FFT
- Concept 06: Feature Extraction with MFCC
- Concept 07: References: Feature Extraction
- Concept 08: Quiz: MFCC
- Concept 09: Phonetics
- Concept 10: References: Phonetics
- Concept 11: Quiz: Phonetics
- Concept 12: Voice Data Lab Introduction
- Concept 13: Lab: Voice Data
- Concept 14: Acoustic Models and the Trouble with Time
- Concept 15: HMMs in Speech Recognition
- Concept 16: Language Models
- Concept 17: N-Grams
- Concept 18: Quiz: N-Grams
- Concept 19: References: Traditional ASR
- Concept 20: A New Paradigm
- Concept 21: Deep Neural Networks as Speech Models
- Concept 22: Connectionist Temporal Classification (CTC)
- Concept 23: References: Deep Neural Network ASR
- Concept 24: Outro
-
-
Module 04: Review: Recurrent Neural Networks
-
Lesson 01: Recurrent Neural Networks
Ortal will introduce Recurrent Neural Networks (RNNs), machine learning models that can recognize and act on sequences of inputs.
- Concept 01: Introducing Ortal
- Concept 02: RNN Introduction
- Concept 03: RNN History
- Concept 04: RNN Applications
- Concept 05: Feedforward Neural Network - Reminder
- Concept 06: The Feedforward Process
- Concept 07: Feedforward Quiz
- Concept 08: Backpropagation - Theory
- Concept 09: Backpropagation - Example (part a)
- Concept 10: Backpropagation - Example (part b)
- Concept 11: Backpropagation Quiz
- Concept 12: RNN (part a)
- Concept 13: RNN (part b)
- Concept 14: RNN - Unfolded Model
- Concept 15: Unfolded Model Quiz
- Concept 16: RNN - Example
- Concept 17: Backpropagation Through Time (part a)
- Concept 18: Backpropagation Through Time (part b)
- Concept 19: Backpropagation Through Time (part c)
- Concept 20: BPTT Quiz 1
- Concept 21: BPTT Quiz 2
- Concept 22: BPTT Quiz 3
- Concept 23: Some more math
- Concept 24: RNN Summary
- Concept 25: From RNN to LSTM
- Concept 26: Wrap Up
-
Lesson 02: Long Short-Term Memory Networks (LSTM)
Luis explains Long Short-Term Memory networks (LSTMs) and similar architectures that preserve long-term memory. A gate-by-gate sketch follows the concept list below.
- Concept 01: Intro to LSTM
- Concept 02: RNN vs LSTM
- Concept 03: Basics of LSTM
- Concept 04: Architecture of LSTM
- Concept 05: The Learn Gate
- Concept 06: The Forget Gate
- Concept 07: The Remember Gate
- Concept 08: The Use Gate
- Concept 09: Putting it All Together
- Concept 10: Quiz
- Concept 11: Other architectures
- Concept 12: Outro LSTM
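One LSTM step, written out gate by gate to match the lesson's learn/forget/remember/use story. The random weights and sizes are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hid = 3, 4
rng = np.random.default_rng(0)
Wf, Wi, Wc, Wo = (rng.normal(size=(n_hid, n_in + n_hid)) for _ in range(4))

x = rng.normal(size=n_in)          # current input
h = np.zeros(n_hid)                # previous short-term memory
c = np.zeros(n_hid)                # previous long-term memory (cell state)
z = np.concatenate([x, h])

f = sigmoid(Wf @ z)                # forget gate: what to keep of c
i = sigmoid(Wi @ z)                # input ("learn") gate: how much new info
c_tilde = np.tanh(Wc @ z)          # candidate new information
c = f * c + i * c_tilde            # "remember": updated long-term memory
o = sigmoid(Wo @ z)                # output ("use") gate
h = o * np.tanh(c)                 # updated short-term memory / output
print(h.round(3))
```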
-
-
Module 05: Review: Keras
-
Lesson 01: Keras
In this section you'll get a hands-on introduction to Keras. You'll learn to apply it to analyze movie reviews.
-
-
Module 06: Sentiment Analysis Extras
-
Lesson 01: Sentiment Analysis with Andrew Trask
- Concept 01: Meet Andrew
- Concept 02: Materials
- Concept 03: The Notebooks
- Concept 04: Framing the Problem
- Concept 05: Mini Project 1
- Concept 06: Mini Project 1 Solution
- Concept 07: Transforming Text into Numbers
- Concept 08: Mini Project 2
- Concept 09: Mini Project 2 Solution
- Concept 10: Building a Neural Network
- Concept 11: Mini Project 3
- Concept 12: Mini Project 3 Solution
- Concept 13: Understanding Neural Noise
- Concept 14: Mini Project 4
- Concept 15: Understanding Inefficiencies in our Network
- Concept 16: Mini Project 5
- Concept 17: Mini Project 5 Solution
- Concept 18: Further Noise Reduction
- Concept 19: Mini Project 6
- Concept 20: Mini Project 6 Solution
- Concept 21: Analysis: What's Going on in the Weights?
- Concept 22: Conclusion
-
-
Module 07: Review: TensorFlow
-
Lesson 01: TensorFlow
In this section you'll get a hands-on introduction to TensorFlow, Google's deep learning framework, and you'll be able to apply it on an image dataset.
- Concept 01: Intro
- Concept 02: Installing TensorFlow
- Concept 03: Hello, Tensor World!
- Concept 04: Quiz: TensorFlow Linear Function
- Concept 05: Quiz: TensorFlow Softmax
- Concept 06: Quiz: TensorFlow Cross Entropy
- Concept 07: Quiz: Mini-batch
- Concept 08: Epochs
- Concept 09: Pre-Lab: NotMNIST in TensorFlow
- Concept 10: Lab: NotMNIST in TensorFlow
- Concept 11: Two-layer Neural Network
- Concept 12: Quiz: TensorFlow ReLUs
- Concept 13: Deep Neural Network in TensorFlow
- Concept 14: Save and Restore TensorFlow Models
- Concept 15: Finetuning
- Concept 16: Quiz: TensorFlow Dropout
- Concept 17: Outro
-
-
Module 08: Embeddings & Word2Vec
-
Lesson 01: Embeddings and Word2Vec
In this lesson, you'll learn about embeddings in neural networks by implementing the word2vec model. A subsampling and skip-gram sketch follows the concept list below.
- Concept 01: Additional NLP Lessons
- Concept 02: Embeddings Intro
- Concept 03: Implementing Word2Vec
- Concept 04: Subsampling Solution
- Concept 05: Making Batches
- Concept 06: Batches Solution
- Concept 07: Building the Network
- Concept 08: Negative Sampling
- Concept 09: Building the Network Solution
- Concept 10: Training Results
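Two pieces of the implementation, subsampling frequent words and collecting skip-gram (center, context) pairs, in a sketch. The tiny corpus forces an unrealistically large subsampling threshold; real corpora use something like 1e-5:

```python
import random
from collections import Counter

words = "the quick brown fox jumps over the lazy dog the end".split()
counts = Counter(words)
total = len(words)
threshold = 0.05     # far larger than the ~1e-5 used on real corpora

def keep(word):
    freq = counts[word] / total
    p_drop = 1 - (threshold / freq) ** 0.5   # frequent words dropped more often
    return random.random() > p_drop

random.seed(0)
train_words = [w for w in words if keep(w)]  # subsampled corpus

window = 2
pairs = [(train_words[i], train_words[j])    # skip-gram training pairs
         for i in range(len(train_words))
         for j in range(max(0, i - window),
                        min(len(train_words), i + window + 1))
         if i != j]
print(pairs[:5])
```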
-